Envision glasses visual aid review

blind travels logo and envision glasses product image

We here at Blind Travels have been following the progress of the Envision glasses visual aid for a couple of years now. Recently, we reported on the availability of this new product and some of its key features. The glasses give users with low or no vision the ability to read text, identify people and objects, and even quickly determine the denominations of currency. Interest in the Envision glasses was growing even during their early development, when they were awarded the Google Play award for accessibility in 2019. We reached out to Envision and they sent us a production unit to test. Does this new visual aid live up to the hype? Let’s talk about it…

Overview

The Envision glasses are an electronic visual aid created by Dutch tech company Envision (https://www.letsenvision.com/). The glasses offer blind and low vision users the ability to read text, identify currency, describe a scene, find objects and people, detect light and color, and more. The glasses require a companion smartphone app, which is available for iOS and Android. The glasses are user-friendly, responsive, and offer excellent text recognition accuracy. Let’s take a deep dive into the look and feature set of this innovative visual aid.

The look

The Envision glasses are built on the Google Glass Enterprise Edition 2 platform, and while there was a lot of hype and media coverage around Google Glass back in the day, most people did not recognize them when I was wearing them in public. The general reaction I got was that the glasses look futuristic and cool. I found them comfortable to wear; the metal frames remind me of Oakley sunglasses with a very oversized right earpiece.

The glasses have two buttons: one at the back of the unit for power and wake, and another on the top of the hinge where the frames fold. They were lightweight, stylish, and easy to wear for a full day. I found them a refreshing change from other visual aids built on larger VR and AR hardware, which tend to be bulky and uncomfortable to wear for long periods – and even worse, tend to stand out in a crowd. The Envision glasses look like sunglasses with no lenses, and most people just ignore them.

Audio is delivered via a single speaker in the right earpiece. The sound quality was good, and the voice quality was on par with current technology screen readers. In most cases, I kept the volume around 30% and bystanders could not hear what it was saying, or just heard a low garble from the unit.

Features and Interface

Interaction with the Envision glasses is performed via a swipe pad located on the right earpiece, just behind the hinge of the frames. The swipe pad recognizes single-, double-, and triple-finger gestures, which may sound overwhelming at first, but with just a couple of minutes of use I was able to easily navigate the menu structure. Audio descriptions and instructions were clear and concise, and tended to show up just when I needed them most or needed a reminder about how a feature worked.

Read

The read menu option offers the user instant text, scan text, and batch scan of documents. Instant text reads any text that appears in front of the unit and is great for reading labels on boxes and bottles. I also found instant text perfectly suited for use at fast food restaurants, where it was very good at reading the menu on the wall above the cashier, though it did not work well for reading a menu in a sit-down restaurant. I did find that the tool will often read things that are across the room, which can be a bit confusing when you are holding a box of cereal and it starts reading the label of a can of soup across the kitchen.

Scan text and batch scan work a bit differently than instant text; they give audio directions to position the document for optimum accuracy. This feature works great for reading a book: as you hold it up, the camera clicks, takes a picture, and reads the text. There is usually a short delay while the glasses prepare the text, and the accuracy was very good to excellent. Quite often the unit will spell out a word rather than read it, but that was not a big deal. Batch scan allows you to scan multiple pages into memory at a time and have them read to you. The only downside with the scan text and batch scan features is that they require a wi-fi connection to operate. The first thing I did when I got the review unit was go to a restaurant expecting to have the glasses read the menu to me, but since the restaurant lacked a wi-fi connection the feature did not work. I was able to muddle through with instant text, which does not require a wi-fi connection.

Overall, the accuracy when reading text was well above my expectations. A trick I learned was to place a blank piece of paper over the page not being scanned when using the scan text and batch scan options; this seemed to help the unit frame the document a bit faster.

Call an Ally

The next menu option is Call. This feature allows the user to call a pre-approved person (set up through the app) for assistance. Using the camera in the glasses, the person on the call can see what you are seeing in real time and help direct you if you are lost or cannot find something. To test this functionality, I added my wife and son to the system, which only took a few minutes. It should be noted that this feature only functions when the unit is connected to a wi-fi signal, so when I was at the airport, I had to log into a nearby restaurant's wi-fi to test it properly. I hope a future version of the software can activate this feature over the phone's cellular connection, as it would then function much like assistance apps such as Be My Eyes.

Identify

Identify is next on the menu, offering scene description, detect light, identify cash, and detect colors. Describe scene has the Envision glasses describe what they see in front of you. The system seems to do a good job of picking out the most important thing in the scene: if there is a woman sitting in a chair by a desk with a computer on it, and a cat on the floor, the glasses will tell you “woman sitting in a chair” or “woman sitting in a chair at a desk”. This feature does require wi-fi; I found it useful in home and office settings.

Detect light uses a tone to inform the user about the amount of light around them. A lower tone denotes a darker ambient light, while a higher tone indicates brighter light. A completely blind friend found this feature useful when preparing for bed, to determine if they had left a light on. The rising tone allowed the user to determine the direction of the light that was still on.

Detect colors performed well when identifying many colors. The glasses accurately identified the change in color of an object as the amount of light present changed. A purple ball was identified as purple with good illumination, but as the light source was moved farther away from the object, the glasses identified different shades of the color until it appeared as dark gray. A really cool feature, and great for exploring your environment. Both the detect color and detect light feature worked without a wi-fi signal. 

Find

We all lose things, and the Envision glasses are very good at finding them. The first option in the find menu is find object. The user can scroll through the menu and have the glasses find a bench, bicycle, bottle, car, cat, chair, dog, keyboard, laptop, motorbike, sofa, table, toilet, traffic light, or train. This feature does not require a wi-fi signal, and I found it very useful, especially for finding traffic lights, benches, and chairs. When searching for an object, the glasses emit a tone when the object is in front of you, and continue to emit the tone as long as the object is directly ahead of you.

Find people is such a cool feature of the Envision glasses. Users train the glasses to find a specific person using the smartphone app. When training this feature, the app asks for a name, then directs the user to take five pictures of the person’s face. I found that straight-on, left, right, top, and bottom angles gave excellent results. Once you have taken the photos, the glasses take a couple of minutes to process, and then find person will identify that person with near perfect accuracy. Entering a room with a trained person yields a “Looks like (name)”. This feature does not require a wi-fi connection, and I found it to be very accurate even in low light situations.

The explore feature also does not require a wi-fi connection and allows the user to explore objects in their surroundings. It was able to easily identify the objects in the find object list above. This feature is handy if you are in a new room and want to get the layout of furniture and other things. To get the most out of it, survey your environment slowly and give the glasses time to identify the objects in front of them; if you move too quickly, the glasses will miss things and not read them out to you.

Likely because of the processing power they require, the find functions tended to drain the battery quite a bit quicker than most other features of the glasses. I would recommend a battery pack to recharge the unit if you are traveling with the glasses and relying on them to give you the lay of the land in new surroundings.

Cool features

The Envision glasses are a capable and fun visual aid, offering the visually impaired user a lens into the visual world around them. Much attention to detail has gone into the creation of this unit; one feature I really appreciated was the voice announcing the battery level when charging. The unit also conveys the battery level, time, date, and the wi-fi network you are currently connected to with a double tap from the home menu.

When you purchase a pair of Envision glasses, you are scheduled for a free onboarding call with the team, which allows you to ask questions and gain some insight into using the glasses to their fullest potential. Take some time to get to know your glasses before the call; the team is very helpful in answering all of the questions you may have about the functionality of the unit. I love that this is offered to all Envision glasses purchasers, as it really sets a new user off on good footing when learning to use a product like this.

New features

Shortly after I received the glasses, a software update came through which added functionality for the glasses to instruct the user in positioning documents for optimum scanning performance. When attempting to scan a book or document, the glasses tell you to slowly move the page to the left or right, or to move your head left or right. Small increments of movement are best here; otherwise the glasses tend to get a bit jumbled. Slow, deliberate movements allowed the glasses to line up the material to be scanned quickly.

A second update during my testing implemented voice commands, allowing the user to quickly access features like instant text, find object, and so on. To access the voice commands, the user depresses the hinge button where the large right earpiece attaches to the frame. There is a bit of a time delay when activating the voice commands, which takes some getting used to. Once you have the timing down, commands work well even when the unit is not connected to wi-fi. I thought the Envision glasses were great with the manual menu system, as it was easy to navigate, but the addition of voice commands makes the glasses even more useful in day-to-day use.

Everything we own these days needs to be updated, and it feels like some of those devices can take ages to complete an update (I’m looking at you, Windows), but updating the software on the Envision glasses was quick and the audio instructions were clear and easy to follow. Read more about the team’s commitment to improving the platform below.

Performance

When it comes down to it, how well a product performs is directly related to the likelihood of a user continuing to use it. The Envision glasses handled the tasks I threw at them quickly and accurately. Battery life was good, and I was able to get a day’s worth of moderate use out of a full charge. Identifying objects in the find object list was generally quick, depending on the scene (as is to be expected): if the object the glasses were looking for was by itself, they identified it much quicker than if there were other objects around to distract the identification process.

Built as a platform

Time to get a bit more technical. We here at Blind Travels love to report on technology related to low vision and blindness, and many readers here have an interest in the more technical aspects of a product, so I believe the Envision glasses are well worth delving into. Recently, I spoke with a couple of members of the development team about the Envision glasses project, and I was pleasantly surprised to find out that the team is treating the glasses as a platform which can be expanded upon.

Platform agnostic

The Envision glasses are built on the Google Glass Enterprise Edition 2, but the code running the glasses is platform agnostic. This means that if Google decided to stop manufacturing the Glass platform, Envision could pivot to other hardware without having to start from scratch. The team is forward-looking in terms of project longevity, which means investing in the platform as a customer or a developer appears to be a safe bet.

Developer kits

The core of the software running the Envision glasses is written with platform “wrappers”, meaning porting it to support other smart glasses or AR-type hardware is something the team is considering. The team plans to make a software development kit (SDK) for the glasses available as early as next year, though which development environments will be supported has not been set in stone as of this writing. I applaud this move; as we have seen with early game consoles, giving independent developers access to the development environment at a reasonable cost allows the creation of some pretty cool (and often crazy) advances in the capabilities of a platform.

Future functionality

Envision are currently in talks with the developers of several well-known navigation apps commonly used by low vision and blind users with the hope they will eventually add their robust location and navigation abilities to the Envision glasses platform. 

Product specifications

From the Envision website:

Camera: An 8-MP camera with a wide field-of-view.

WIFI and Bluetooth

Battery: 5-6 hours with regular usage. USB-C supported fast charging.

Audio: Directional Mono Speaker, USB audio and Bluetooth audio.

Robust and Light: Water and Dust resistant. Weighs less than 50 grams.

Difficulties

No visual aid is perfect, and with anything there is a learning curve the user must navigate to become adept enough at using the functions of a device to take full advantage of its capabilities. The Envision glasses are no different, and though the unit has a few idiosyncrasies, none of them are deal breakers in my book.

Waking up

I did find the glasses entering sleep mode quite a bit when using them throughout the day. Users need to be aware of this, because it can take quite a few moments for the system to come back online from sleep. I surmise this is to preserve battery power, and I fully realize that there is a balance between preserving power and availability to the user. The Envision glasses are not an “instant on” device, and when they are in standby mode there is no audio cue for them powering back on, something I suspect is a limitation of the hardware.

So sensitive

The primary user interface for the Envision glasses is the swipe pad on the right earpiece. Learning the interface is straightforward, and within a couple of minutes it is easy to have all the gestures down and navigate the menus effectively. The swipe pad, however, does not always recognize two-finger gestures, which are the way to return to the main menu (two-finger swipe down) and adjust the volume (two-finger swipe up); it often took multiple tries for each. When using the glasses in read mode, I often found myself inadvertently enabling offline mode while navigating. I believe this is due to the sensitivity of the swipe pad and the location of the offline mode menu option. Not a big deal, but it did take some getting used to.

Standby mode 

The review unit I had encountered a few hiccups when powering up from sleep or standby mode. If you do not use the glasses for a bit, or fold them up and set them on your desk, they enter sleep mode. There is often a 30-second or longer delay when the glasses power back up to a usable state. There is no sound or user notification when the unit is starting up, and at first I found myself thinking the unit was powered off, so I held the power button to restart. I assume this is a limitation of the hardware platform the glasses are built upon. In my testing, there were also several times I could not get the glasses to power up at all and ended up plugging them in, which triggered a startup. I am not sure if this is a common issue with the glasses or just an issue with the unit I was reviewing.

Standby battery

Standby battery life for the Envision glasses was a bit of an issue. From a full charge before bed, I set the glasses on my desk and returned in the morning to around 59 percent charge. This is a concern when traveling: I used the glasses to navigate the airport before my flight, put them away until arriving at the next airport, and found them just below 50 percent charge. Not a deal breaker, as I travel with a power brick for my USB devices and the glasses offer fast charging, but I thought it was worth noting. Your use case and mileage will vary.

Low light

The Envision glasses have some difficulty reading and finding objects in low light situations. I commonly encountered the glasses giving the not enough light error when trying to read menus in darker restaurants. I am sure this is a limitation of the hardware and perhaps could be improved in future revisions of the product.

Features I would like to see

The quality and accuracy of the Envision glasses are great, and having a lightweight, easy-to-use pair of glasses that will read things to a blind or low vision person will be a game changer for many users. However, there is always room for improvement, no matter how awesome a product is. After a couple of months of testing, here is a short list of features or improvements that would be of great benefit to the platform:

Refine instant text

Instant text tends to read things far away from the user, which can be a little confusing. The unit is just reading what is in front of it, which makes sense, but if the glasses had the ability to read something the user is pointing to, it would be a natural way to interact with them. For those who have some vision, pointing at a sign or other text could be an amazing feature. If adding a feature like this would limit the functionality of instant text in offline mode (when the unit is not connected to the internet), then perhaps a separate point-to-read menu option could be added.

Games

The Envision glasses do not recognize playing cards as of this writing. When I first heard about the Envision glasses, I had dreams of playing poker or other card games without a sighted person to aid me. I know this functionality is being worked on at Envision, but I can imagine that teaching the AI algorithm to recognize cards will be a significant task: just think about how many different card styles there are, as every playing card manufacturer has their own look. Envision could limit the functionality to a few card manufacturers (I smell a collaboration opportunity), and if card recognition could be combined with the point-to-read option mentioned above, then low vision players could start enjoying games without braille cards.

Unify identification

Currently, users need to enter the identify cash menu function to identify bill denominations; it seems the explore feature should be able to access the same data that allows the glasses to identify bills. The same goes for people who have been trained into the system: when exploring a room, people were not identified by name, but rather as a man or woman and what they are doing, like sitting in a chair. I hope future revisions of the software make the explore and identify operations more general. If the user trains the glasses using the train Envision feature, that training should roll out to the whole system; it is a disconnect that the system can identify something in one mode but not in the others.

What do they cost?

Assistive technology can often come with a bit of sticker shock. The Envision glasses recently had a significant price drop to $2,499 (USD), and part or all of that cost may be covered by insurance, depending on your plan. I have reached out to a couple of different insurance providers (including my own) about the current percentage being covered and will update this review once I have some definitive numbers to share. For now, contact your provider and ask if they cover a percentage of purchased visual aids.

Envision also recently announced that their companion app, which has many of the features of the glasses, is now free to use on iOS and Android. This is great news.

The verdict

As someone who has low vision, I found the Envision glasses very useful and enjoyed the time I had with them to create this review. The Envision team is committed to improving the product, and the addition of voice commands adds another layer of usefulness to an already well thought out user interface. The glasses are stylish and lightweight, with good battery life. Considering the plans the team has for improving the glasses, this seems a good bet for a useful product for anyone with low or no vision.

I’d like to take a moment to thank Envision for sending out a review unit for me to test.

Resources

If you would like more information about the Envision glasses, here is a link to the manufacturer’s website.

https://www.letsenvision.com/envision-glasses             

 

Before you go…

Thanks for checking out our Envision glasses visual aid review. I love to hear from my readers: have you tried the Envision glasses, or do you have questions about my time with them? Feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

 


Update on COVID-19 accessible tests

blind travels logo, text and silhouette of guide dog and handler

A couple of days ago I posted about free accessible COVID-19 tests which are readable through a smartphone for blind and low vision users. Today it was announced that the USPS will send 12 tests in 6 packages instead of 2. Here is more information: I placed my order, and they gave me information for five packages since I have already received my first order.

https://acb.org/accessible-COVID-tests-announcement

Link to order directly

https://special.usps.com/testkits/accessible


Blind Travels featured in Accessible Journeys Magazine

A heartfelt thank you goes out to Accessible Journeys magazine for the wonderful article on Blind Travels in their summer issue. While we focus mainly on blind and low vision accessible travel, the fine folks at Accessible Journeys create articles and tips geared toward a variety of disabilities. I encourage you to take a moment and check out their well produced and beautiful looking magazine.

For blind and low vision viewers, Accessible Journeys features a built-in screen reader mode, which allows users to enter a text-heavy version of the magazine that works perfectly with NVDA and JAWS.

Here is a link to the summer edition of the magazine.

https://reader.mediawiremobile.com/M%C3%A9lange%20Magazine%20Publications/issues/207950/viewer?page=1

Before you go…

I love to hear from my readers! Have you checked out the latest issue of Accessible Journeys? Let me know what you think! Additionally, if you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

 


Accessible COVID-19 tests are now available

blind travels logo, text and silhouette of guide dog and handler

At home COVID tests are great, but until now, if you had little or no vision, it was impossible to read the results. With these tests, the results are read through an app on your smartphone via Bluetooth. The test is compatible with iPhones and Android smartphones, but not Android tablets. An iPhone needs iOS 12 or later, and an Android phone needs Android 6.0 or later and Bluetooth 4.0 or later to run the tests. The link below has a list of compatible devices from the manufacturer.

Click to access COVID19-Home-Test-List-of-Compatible-Devices_Final-B.pdf

According to the manufacturer, complete audio instructions are provided in the app to aid in connecting to the test units and obtaining the results. I have ordered mine and will report back with more information.

You can sign up and order your tests from the USPS website at the link below, or by calling: 800-232-0233. Each order is for one package of two accessible tests, and there’s a limit of one order per address. 

https://special.usps.com/testkits/accessible

If you would like more information about the program or the tests, here is an article from seniorsmatter.com

https://www.seniorsmatter.com/free-accessible-covid-19-tests-now-available-for-visually-impaired-people/2613400/

Before you go…

I love to hear from my readers! Have you tried the new accessible rapid COVID-19 tests? What do you think? Additionally, if you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

 


Complaint alleges DraftKings website is not accessible to blind users

blind travels logo, text and silhouette of guide dog and handler

Robert Jahoda recently filed a lawsuit alleging the DraftKings online gambling website is not completely compatible with screen readers. Lawsuits like this are commonly seen, and unfortunately not surprising: the needs of blind and visually impaired users are often overlooked when companies develop websites. Even worse, I personally know web developers who think they can close their eyes and gain a true insight into the way a blind or visually impaired person surfs the web. What are companies to do with lawsuits like these? Are they helpful to online commerce in the long run?

It’s the law

Users with disabilities have the same right to use a website or technology as anyone else, and under the Americans With Disabilities Act, signed into law in 1990, a company must make its services available to everyone. From the ADA National Network brief on accessible digital technology:

One of the main issues concerning the ADA and web accessibility is what constitutes “public accommodations.” Title III of the ADA provides regulations for private businesses and other entities to ensure access for people with disabilities within the realm of public accommodations, described as “businesses that are generally open to the public.”2 This phrasing has become central to the understanding of web accessibility within the U.S. Two notable court cases related to this issue include Robles vs. Domino’s Pizza LLC and National Association of the Deaf et al. vs. Netflix, Inc. Both of these cases were ruled in favor of the plaintiffs (i.e. people with disabilities). The rulings detail that companies must provide accessible features in online applications and web-based businesses. Despite the number of web accessibility cases, the Department of Justice has withdrawn potential rule changes to the ADA to provide further regulations on digital access. Businesses can utilize other standards, such as the Web Content Accessibility Guidelines (WCAG),3 in order to meet the spirit of the ADA and provide accessible technology to the public.

Under the ADA, a visually impaired person wishing to access digital content has every right to do so. This means that companies which do not have accessible websites must provide them. Until the standards for accessible information delivery are solidified, class action suits such as the one against DraftKings are going to continue. My suggestion to any company facing this sort of litigation is to see it not as a negative or an inconvenience, but rather as a potential positive impact on your business.

Big business

According to the CDC, nearly 7 million people in the United States are blind or visually impaired. A company which does not provide accessible web content is alienating these potential customers. Being visually impaired myself, and a screen reader user, I appreciate and often offer my loyalty to websites which are fully accessible. Conversely, websites which employ tactics to keep visitors on their page longer, such as popup ads with the close button located in strange or hard-to-find locations, will generally not see my patronage again.

Big bucks

Companies like DraftKings, which can afford to be a primary advertiser for the Super Bowl and have the advertising budget to be in every television break during the Stanley Cup playoffs, can certainly afford a person on staff to properly review their website for accessibility. Why not take a different tack when resolving this issue: employ a visually impaired person to help, and give your company a positive PR spin with the blind and visually impaired community? Increasing the number of people who can use your site can only benefit your business and offset any development costs incurred while implementing accessibility features.

This sentiment extends beyond DraftKings: any corporation providing content to the public should be aware of the needs of its disabled customers, and be proactive about meeting those needs before class action suits start heading its way.

Resources for this article

If you would like to read more about the class action suit, follow the link below. 

DraftKings complaint alleges website inaccessible to visually impaired, blind

If you would like more information about the ADA Title III and Digital Access, follow the link below.

https://adata.org/research_brief/digital-access-and-title-iii-ada

If you would like more information and statistics about the number of Americans with low vision and blindness, follow the link below.

https://www.cdc.gov/visionhealth/risk/burden.htm

Before you go…

I love to hear from my readers! What do you think? Are these lawsuits frivolous, or are they of benefit to the blind and low vision community? Additionally, if you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/


The difference between emotional support animal and service dog

blind travels logo, text and silhouette of guide dog and handler
Do you know the difference between an emotional support animal and a service dog? It can be confusing, especially with the new legislation and all of the internet companies promising documentation that lets you take your dog anywhere you go. I am a guide dog user, and people who mistakenly believe they can take a household pet anywhere because they bought an official-looking ID card online often make things more difficult for those who use well-trained, legitimate service animals. It is a bit sad when I go into a restaurant and park my guide under the table, only to have the hostess come over and say they have never seen a well-behaved service animal before. With some questioning, it often becomes clear that the clientele who bring in dogs masquerading as service animals bark and cause a commotion while in the restaurant. This is sad, and I do what I can to educate staff on how a legitimate service animal should behave.
You can learn lots more here:
https://www.baywoof.org/ask-dr-dog/ogel0jyo5pikvw8t0qmuibjg6q8ksd?fbclid=IwAR25g0UrF50HkZz0gbK6x4lKyVUYukU-5LQG2COSdSe3oU4KGC_o2PBSpbM

Don’t pet the fluffy cows!

A majestic looking buffalo in a field of brown grass with the rocky mountains behind him.
A reminder to keep an eye on your travel companions this summer. The woman in this article intentionally got too close to a bison while visiting Yellowstone National Park. The animals in our national parks are not domesticated (in almost every case) and have little to no tolerance for tourists doing dumb things. It feels like every year we have to put out reminders like this because someone got too close to an elk during rutting season or wandered up to a bison for a selfie.
I feel like it is the responsibility of the blind or visually impaired person in the group to point out that someone is about to do something silly. Since your travel companions are usually narrating the happenings, you can take it upon yourself to reiterate what is going on... What do you mean Stacey is walking up to a bison for a selfie??!?
Let’s have a safe and sane summer vacation season!

Meow Wolf Denver Audio Described Tour

Ted and his black lab guide dog Fauna inspect a wall-mounted sculpture. Ted is inspecting the sculpture via touch, he wears glasses and has a ponytail in his hair.

Meow Wolf is a Santa Fe-based art and entertainment company that creates immersive experiences designed to transport guests out of their normal reality into a fantastical environment packed with visual, tactile, and auditory delights. The first Meow Wolf opened in Santa Fe in 2016. Since then, the company has expanded to two additional locations: Las Vegas and Denver.

As a visually impaired person, I love art of all types, especially immersive art with a strong tactile and auditory component. I was very excited to learn that Meow Wolf is now offering Audio Described tours, which depart the Convergence Station lobby at 10 AM on the second Saturday of the month. There is no additional cost for the Audio Described Tour; guests just need to schedule and confirm their time through the Meow Wolf website (more on that later).

Recently, I was offered an opportunity to join one of the Highlights of the Convergence Audio Description Tours, so I loaded up my wife and guide dog Fauna anticipating a great multi-sensory immersive art experience. I had not been to Meow Wolf before this tour, so I was really looking forward to it.

The Venue

The Denver Meow Wolf is located off I-25 near Empower Field at Mile High. Parking is limited, especially during football season, so plan your tour with plenty of time in mind. Meow Wolf boasts five stories and 90,000 square feet of space, two-thirds of which is dedicated to immersive art; the remainder houses offices, a café, a bar, and an 800-person music venue.

My Experience

We arrived about half an hour before our tour time, queued up in the security line, and headed in. After our tickets were scanned, we were directed into the venue and escorted to the waiting area where the audio described tour would meet. We were handed a single-ear headphone to wear during the tour, which certainly helped in the noisier environments of the venue. Members of the Meow Wolf QDOT Tour Team gave clear directions about the tour and did a great job of slipping into character as Quantum tour guides showing travelers (guests) around the Convergence Station.

The two-hour tour was a highlights tour, touching on the lore and overall story arc of visiting a new dimension in each area of the Convergence Station. As we progressed along the route, the QDOT guides took turns relaying information and fun facts about "their area," which added a nice level of personability to the tour and helped bring the story to life. We wandered to and through each area of the venue and were treated to a variety of opportunities for tactile interaction. The guides were well-rehearsed and did a commendable job of delivering interesting tidbits, including well-thought-out descriptions of the art that even the fully sighted might miss.

As expected with a guided tour aimed at the visually impaired, one member of the QDOT Tour Team was dedicated to wayfinding, ensuring that the travelers (guests) on the tour were directed to each location with clear and concise instructions. My guide dog is not easily distracted by sights and sounds, so she had no issue navigating from exhibit to exhibit. The entire team was very good about noting where handrails were located as we journeyed through the Convergence Station, as well as the objects around us at any given time. They deftly pointed out interesting textures and were well versed in offering to help guests find the features they were describing as the tour progressed.

A mid-tour intermission was offered, and the QDOT Tour Team were more than willing to lead travelers to the restroom facilities if needed.

The environments

Around each corner in Meow Wolf Convergence Station, it felt like entering a new dimension, with unique visuals, sounds, and textures to explore. Each environment is separated by thick vinyl curtains, doors, or elevators, and the tour team was very good about holding these open and ensuring the whole group was through before moving on.

Each new environment featured interesting ground and wall textures to experience, environmental sounds, and a myriad of cool lights. It should be noted that if guests are sensitive to bright lights or (in some areas) loud-ish sounds, Meow Wolf offers a variety of accommodations to help with these kinds of sensitivities; all one needs to do is ask at the front desk when entering the venue. Though a couple of parts of the tour had louder sounds, they did not bother my guide dog, and she was not distracted by the bright lights in the darkened environment – your mileage may vary, of course.

At no time did I feel unsteady while traversing the exhibits. Guests should be aware that there were a few areas where the wall textures extended to the ground and could pose a tripping hazard; the tour guides considerately pointed these out, but keep this in mind if you are visiting by yourself. Walkways, hallways, and other areas seemed adequately wide for those who are mobility impaired or use wheelchairs. The only places without level ground or ramps were the explorable vehicles: some were a bit higher than a standard sedan, and the bus we entered was not wheelchair accessible. There were, however, many interesting things outside the vehicles to explore and enjoy.

More to see

At just around two hours, the Highlights of the Convergence tour's length was right on the money. It did not feel like it dragged in any way, and it was made very clear that Meow Wolf has a lot more to offer. The tour did a great job of conveying the background and interesting facts about each environment and inspired guests to return for further investigation. Once the tour was completed, we happily wandered back into the exhibits to explore some of the smaller areas, which is easier to do with a smaller group. There are so many doors, and behind many are cool small areas to explore.

Be sure to bring your smartphone; there are QR codes to scan that take you to interesting web pages, just another way to explore this fun and immersive environment.

My favorite part

A couple of exhibits really stuck with me. The first was a room whose artist is a wheelchair user and designed the space with accessibility in mind. I found it interesting to hear other guests discussing that they had been unaware of what is required to make a kitchen accessible to a mobility impaired person.

I also very much liked the exhibit centered around memory. The team explained that the artist had a family member going through the stages of Alzheimer's, and the lighted and textured sculptures in the exhibit represented neurons in the brain. The installation featured monitors in housings made to look like old-time televisions, displaying memories of the artist and their afflicted family member. This fit well into the cave of memories environment, especially when the QDOT Tour Team explained that in the lore of the Convergence Station, memories are used as currency. It was poignant to follow the thought that the person with Alzheimer's would become poorer as they lost their memories to that terrible disease.

Accessibility

It is refreshing to see a company that is “all in” on accessibility for their guests. Clearly, Meow Wolf has spent ample time considering accessibility for most guests requiring accommodations. If you have questions about accessibility, please check out Meow Wolf’s Accessibility Page at:

https://meowwolf.com/visit/denver/accessibility

It should also be mentioned that Meow Wolf has a no-pet policy and does not allow emotional support animals, comfort animals, or therapy animals. Having taken the tour with my guide dog leading me, I am not confident that a service animal without a high level of training would perform well in the immersive and stimulating environment Meow Wolf offers.

If this article has inspired you and you would like to schedule a spot on an upcoming Highlights of the Convergence tour, fill out the form at the link below and the Meow Wolf staff will walk you through the necessary steps. There is no additional cost to attend the tour; it is included with the price of regular admission. You only need to schedule the tour time with the staff.

Meow Wolf page to sign up for Audio Description Tour

https://meowwolf.com/visit/denver/audio-description-tour

Final thoughts

The exhibits in Convergence Station ranged from fun and light to deep and thought-provoking in nature. Some pieces were intimate and quiet, inviting introspective contemplation and others were loud, raucous, and bright. Travelers never knew what was coming next, and that was part of the fun. Along the way, the QDOT Tour Team enhanced the experience with descriptions that were just the right length. I highly recommend this tour for anyone who is low vision or visually impaired, and I certainly plan to return for some solo exploration of the exhibits.

Before you go…

I love to hear from my readers, have you attended the Highlights of the Convergence tour? What did you think? Additionally, if you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on blind travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

 

Resources

Main Meow Wolf home page

https://meowwolf.com/

 



New glasses incorporate AI to help the visually impaired

Image featuring the Blind Travels logo and the Envision glasses

Anyone who is blind or visually impaired can instantly list the everyday tasks that cause them frustration: reading a menu at a restaurant, determining who is in a room when you enter, finding an open seat when boarding a bus or train, and telling the difference between denominations of money, just to name a few. What if there were a product available that could do all of those things and more? The new glasses from Envision incorporate AI and a smartphone app to give low vision users a new way to experience their environment.

Built for expandability

Envision glasses are built on Google Glass Enterprise Edition 2 and use AI to scan, read, and OCR text in over 60 languages. In videos demonstrating the technology, the user activates a function on the glasses to take a picture of the text, which is then read out loud. This works well for paragraphs of text, but the glasses can also read small amounts of text in near real time. The glasses and companion smartphone app are being developed as a platform, allowing for quick updates and the addition of new features. So, other than reading text, what else can the Envision glasses do?

Basic features

The glasses offer scene description, including object detection, color and light detection, and face recognition. The feature set seems to tick many of the boxes that visually impaired users look for in this type of product. The combination of the AI-powered glasses and smartphone app appears to incorporate functions from several low vision apps into one platform, with the promise of further development and feature additions.

Call for help

The glasses can also handle video calling, allowing users to contact a trusted party who can see from the user's perspective and offer assistance in real time. This lets the user navigate difficult situations that fall outside the Envision glasses' feature set.

Tech Specs

From the Envision website:

  • Camera: An 8-MP camera with a wide field-of-view.
  • Wi-Fi and Bluetooth
  • Battery: 5-6 hours with regular usage. USB-C supported fast charging.
  • Audio: Directional Mono Speaker, USB audio and Bluetooth audio.
  • Robust and Light: Water and Dust resistant. Weighs less than 50 grams.

What do they cost?

Assistive technology can often come with a bit of sticker shock. The Envision glasses cost $3,500, and part or all of that cost may be covered by insurance, depending on your plan.

Thoughts

From the preview articles and videos I have reviewed, the new Envision glasses appear to have the features needed for daily use at work or elsewhere. With a 5-6 hour battery life under regular use and fast charging, users should get through a full day, especially if they bring a battery pack just in case. The cost does not seem out of line for a specialty product, given that users may be able to offset it through insurance or another agency. For me personally, being able to walk into a room and have the scene described, including who is there and the objects in the room, would be a big benefit. In terms of travel, I can see the Envision glasses being invaluable at new destinations: finding your gate at the airport or train station, or eventually using self check-in kiosks unassisted.

At the request of many of my readers here and on social media, I have reached out to Envision to see if I can procure a review unit so I can create a full breakdown of the functionality of the unit for you all.   

Resources

If you would like more information about the Envision glasses, here is a link to the manufacturer's website, along with a link to a California-based news channel with a video preview of the glasses in action.

https://www.letsenvision.com/envision-glasses

News Story

https://www.kpvi.com/interests/ca-high-tech-glasses-help-visually-impaired/video_822bf324-aaa4-50c9-82f0-624c338b081e.html

Before you go…

I love to hear from my readers, if you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on blind travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/


Traceability codes for guide dogs get their debut at the 2022 Paralympic Games

blind travels logo, text and silhouette of guide dog and handler

At the 2022 Beijing Winter Paralympic Games, 68 guide dogs were selected as special volunteers to help the athletes. These dogs all wear a new identification QR code which, when scanned, provides important details about the guide dog. The new identification system is integrated into China's product quality traceability system, which is used for tracking many kinds of products and was created using big data, cloud computing, and blockchain technologies. These technologies make the traceability codes the dogs wear immune to counterfeiting.

U.S. Implications

As a guide dog user, I have long wished for a national system for identifying legitimate service animals. A program such as this would allow hotel and travel providers to scan a service animal's unique code and determine whether it is legitimate, check its vaccination status, and more. Implementing such a system would take care of one large problem with fraudulent service animal use in the travel and hospitality industry. At present, those who can't bear to leave their dog at home can easily obtain fraudulent identification cards from a myriad of internet companies. The result is often misbehaving or nuisance animals masquerading as service animals, which do little more than give legitimate service animals a bad name in the eyes of travel and hospitality workers. I'm not sure this kind of system would ever be implemented in the United States, because of fears surrounding government tracking and overreach. I for one would welcome a national registry, and it could start as simply as requiring each service animal user to register the animal and obtain an identification card.

Tracking

During the early days of COVID, China implemented an identification and tracing system that tracked the user's vaccination status and helped trace infections and exposures. I remember so many news stories about people who were upset with the system and hated the fact that scanning the QR code would give a detailed profile of the user's movements around the country and the people they interacted with. Given the divisiveness in our country today, I can't imagine a similar system being implemented for any cause, even the identification of service animals.

False Identification

A nationwide system to identify legitimate service animals would be a great benefit for service animal users, especially if there were a way to make it immune to counterfeiting. There have long been talks of a national registry or service animal identification system. This would be a good first step, though I fear there will still be internet doctors willing to prescribe service animals to anyone willing to pay.

Training

The second head of the fraudulent service animal monster is lack of training. Those who obtain false identification for their dog so they can bring it on flights and into hotels often own the animals that are least trained, or are afraid of people and traveling. They put their poor pets into situations where they are prone to bite, bark, and be a nuisance to those around them. I don't think the uninformed understand how much training and socialization a guide dog goes through before it is even considered for a guide dog program. I cannot tell you how many times I have been exiting a restaurant and the hostess or wait staff have told me they didn't even realize my guide dog was under the table. They often relay a story of someone bringing in a small, bite-prone dog that sits under the table and barks at everything that goes by, and they say it is a true treat to see an animal with impeccable training in their establishment.

Knowledge is key

Everywhere I go, I love to take time to talk with and educate the public about the role my guide dog plays and what she had to accomplish to become part of a guide dog team. It is through education that legitimate service animal users can start changing the mindset of service industry workers: away from eye-rolling and bemoaning the fact that someone is bringing an ill-trained faux service animal into their establishment, and toward seeing the legitimate service animal as a valuable asset to the handler.

What do you think?

Do you believe people should be able to purchase vests and identification cards for faux service animals? What are your ideas for a national registry of legitimate service animals? I'd love to hear your opinion on this, along with any ideas about how we can move our government forward with some sort of program – it doesn't have to be all-encompassing; it can be a first step.

More information

If you would like more information about the identification system China put in place during the early days of COVID, you can read this article:

https://correspondent.afp.com/new-codes-governing-everyday-life-china

If you would like to see a short video about the guide dogs that were used in the Beijing 2022 Paralympics, and learn about the new identification system, you can go here.

https://news.cgtn.com/news/2022-03-14/Traceability-codes-help-guide-dogs-provide-efficient-services–18oGdiYlBdK/index.html

Before you go…

I love to hear from my readers! If you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

