Hyatt Regency Tamaya Resort and Spa New Mexico

The view from our room at sunset at the Hyatt Regency Tamaya Resort and Spa. The trees are in fall bloom and the mountain has a few clouds above it.

Have you checked out our reviews section lately? Today’s new offering is a review of the accessibility at the Tamaya Resort. 

Recently, my guide dog Fauna and I journeyed to the Hyatt Regency Tamaya Resort and Spa in New Mexico. Let’s talk about the #accessibility at this unique desert destination retreat.
#traveltwitter #travelbloggers #disabled #blindtwitter
https://www.blindtravels.com/accessible-excellence-a-review-of-the-hyatt-regency-tamaya-resort-and-spa/

“Ted’s journey into the landscape of the human body is a marvelous celebration of all that is physical, sensual and diverse.” – FSTOPPERS

About the author

Ted Tahquechi is a blind photographer, travel influencer, disability advocate and photo educator based in Denver, Colorado. You can see more of Ted’s work at www.tahquechi.com

Ted operates Blind Travels, a travel blog designed specifically to empower blind and visually impaired travelers. https://www.blindtravels.com/

Ted’s body-positive Landscapes of the Body project has been shown all over the world. Learn more about this intriguing collection of photographic work at: https://www.bodyscapes.photography/

 Questions or comments? Feel free to email Ted at: nedskee@tahquechi.com 

Instagram: @nedskee

Twitter: @nedskee



Lancaster Museums enable Visually Impaired People to appreciate art collection through 3D-Printed Versions of Paintings

blind travels logo, text and silhouette of guide dog and handler

Making art accessible


3D printing has come a long way in a very short time. I love that museums and other art institutions are utilizing this technology to make art of all kinds accessible to the visually impaired. From the article:

Lancaster district’s art collection will soon be brought to life for people with sight loss thanks to Lancaster City Museums, Lancaster University and Galloway’s Society for the Blind in Morecambe.

I’m also glad that they are looking into making the art even more accessible by including audio descriptions. I’m a big proponent of audio descriptions in art installations, especially when it is the artists themselves who create the audio files. Through their words, viewers can appreciate the passion the artist has for the work. These are exciting times for those with visual impairments; we are finally able to take part in appreciating the visual arts.

I hope to see more art institutions using 3D printers for this sort of work. When I launched my Landscapes of the Body project, it was before 3D printers were really “a thing,” and having prints made of my work was prohibitively expensive. I took a different route, working with a local printmaker to create a tactile version of the visual prints. I feel offering tactile versions of the art is important so everyone can enjoy it on their own terms.

If you would like to read more about the Lancaster City Museums project, you can head to this link.

What do you think? Are art institutions doing enough to make their art accessible for the visually impaired community? Let’s talk about it on my social media links below.

 



Voting and accessibility (it’s that time again)

A series of “I Voted” stickers randomly strewn about on a table

It’s time again for United States citizens to begin pondering their stances on the upcoming midterm elections. While we won’t be deciding on the president, we will be electing many local and state representatives. It may be easy to dismiss the midterms as less important than the big presidential election every four years; however, these electoral cycles can affect you more directly than you think.

 Your voice matters

I make a point of staying out of politics. There have never been any political pieces here on the blog, because I feel it is important to maintain a neutral course in the crazy world of politics. Whatever your reason for reading this article, though, one point stands head and shoulders above the rest, regardless of your political affiliation: your voice and your vote matter. You never know when an elected official will propose or enact a change that adversely affects you, and this is especially true for blind and low vision individuals.

Do your research

Regardless of the side of the aisle you vote on, it is always worth your time as a visually impaired person to research new accessibility measures and accommodations for blind voters. Now is the perfect time: just type “accessibility for blind or visually impaired voters” and your state into Google. Here in Colorado, I learned that SB21-188, Ballot Access For Voters With Disabilities, allows a voter with a disability to use an electronic voting device that produces a paper record to vote in a mail ballot election. This is great and all, but if I can’t read the screen to make my choice, a piece of paper I can’t read isn’t going to do much more for me. Considering the needs of disabled voters is a step in the right direction, but it still does not resolve the biggest issue most of us with vision loss have with voting: being able to read the ballot options.

Late homework

Every two years during the midterms, a ton of articles are posted about making the voting system more accessible for vision-impaired users, and every four years, with the higher-profile presidential elections, these cries seem to intensify. Many of us enter the polling station and have someone there read the choices to us and mark our desired option. There is a lot of trust involved in this way of doing things; the voter relies on the person reading and marking the ballot to honor their choice. Don’t get me wrong, I am not saying that people working at the polling stations are unreliable or unethical, only that the voting system needs to be overhauled so that people with vision impairments can read and choose for themselves. Given the current political climate, I strongly believe it is safer for all of us to be able to make our choices in private, without having to endure comments about our candidate of choice.

It is too late (again) for municipalities and states to change their voting systems this cycle, so can’t we all get started on fixing the system and making it accessible for all voters before the 2024 election?

Conclusion

How is voting handled in your state? Do you ever find yourself in a situation where you feel uncomfortable when voting? Let’s talk about it! Get in touch with me on my social media links below!

A camera with a screen reader: the Sony A7R IV

Sony A7R IV camera body with no lens and the image sensor exposed

There are a lot of visually impaired people who use still cameras as an important tool to capture a scene (like a birthday party or other special event) and view it later on a larger computer monitor. The inherent problem with this approach is that, until recently, there were no DSLR or mirrorless cameras available with accessible menus. In late 2021, Sony launched their Alpha 7R IV (A7R IV) camera, which includes a first-of-its-kind screen reader.

Jumping ship

Now, this is not one of those attention-grabbing articles that every photography blogger inevitably writes. Camera body features evolve over time, and there are always YouTube photography influencers who make a big deal about jumping ship from one manufacturer to another. Reasons vary but mainly focus on a new feature set or a megapixel increase. These influencers sell their entire kit and wave the flag of the new manufacturer, only to return at a later date because they always loved the previous one.

I have been shooting professionally since 2000 and have been a Canon user that entire time. I’m almost completely blind, and I have stuck with Canon because I am comfortable with the menu layout and can navigate it quickly. My belief is that a photographer should invest in good lenses, since body features will evolve over time. With good glass in hand, lenses can be adapted to new technology, as in Canon’s transition to mirrorless and a new lens mount.

I have never been one of those photographers who considered jumping ship from one camera manufacturer to another because of a single feature – until now. Having a screen reader available in the camera menu system could be a game changer for any photographer who is visually impaired.

Market Need

After a couple of weeks of debating this point with many photographer colleagues, the consensus from working photographers seems to be that a screen reader is not a needed function on a pro-level camera, because few visually impaired users would consider purchasing a $3000+ camera setup. I disagree with this mentality. My experience is that when manufacturers are required to, or volunteer to, implement accessibility features in any product, that product tends to increase in usability in unexpected ways for a larger audience than just those who need the accessibility features.

In the case of this camera, the screen reader not only helps users who are low vision or visually impaired, it forces those creating the menus and functionality of the product to take a long, hard look at what features they are implementing and how (I know this because I come from a long product manufacturing background). Just as beeping signals tell the visually impaired when it is safe to cross the street, the addition of screen reader technology could give Sony the opportunity to reflect on their menu layout and, in effect, increase the quality of the user experience for all their users. I applaud Sony for implementing this feature even in its limited state (more on that later) and hope that other camera manufacturers follow Sony’s groundbreaking lead.

The A7R IV

The A7R IV is a popular choice for professional and enthusiast photographers, sporting a 61-megapixel sensor, 4K 60p video recording and 10-frames-per-second burst shooting. It is a great choice for still and video shooters alike.

Early days

Screen reader technology has been available for years and is commonly used in everything from computers to smartphones and ATMs. Sony (as of this writing) is the first camera manufacturer to include screen reader technology in a pro-level camera. I have not been able to get hold of one of these cameras to test it myself, but it appears as though the initial implementation is a bit limited in the menus it will read. Users can adjust the speed of the voice up to 4x, which is great for screen reader power users, and can also adjust the volume of the screen reader voice.

Final Thoughts

As a professional landscape and travel photographer, I have never been tempted to trade in all of my gear and purchase a new camera system, but if Sony is not only leading the way in accessible photographic technology but also committed to adding new capabilities to this screen reader, this could be my reason to switch. Thanks to the readers who alerted me to this technology. I will reach out to Sony and see if I can get a review unit so I can create a feature article about using this camera in the field.

If you would like to read more about Sony’s screen reader technology, here is an article I found from digitalcameraworld.com:

https://www.digitalcameraworld.com/news/screen-reader-feature-on-sony-a7r-iv-is-welcome-news-for-the-visually-impaired

Before you go…

I love to hear from my readers. If you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.



Blind and visually impaired guests at Disneyland Paris will now have the ability to access audio description content via the AudioSpot mobile app to enhance their park-going experience.

blind travels logo, text and silhouette of guide dog and handler


This is really great news; I hope we continue to see more of this sort of thing. Embracing the needs of visually impaired travelers at your destination only increases the likelihood of return visits.

AudioSpot app at Disneyland Paris increases accessibility for the visually impaired


Ellume COVID-19 Home Test Accessibility Review

Recently I reported on the availability of COVID-19 tests advertised as accessible for blind and visually impaired users. The tests are available at no cost from the US Post Office and come in six packages of two tests each (I’ll include a link to the ordering information later). According to the press release about the tests, full audio instructions are included in the app. After returning home from a trip, I received an exposure notification, which advised quarantining if I was not vaccinated and testing if I began exhibiting symptoms of COVID. Since I got back, I have been feeling blah and noticed elevated temperatures at night, which was one of the first symptoms I showed when I contracted COVID last year. I thought this was an opportune time to give these accessible home tests a whirl and report to all of you. Were they easy to use, and, perhaps more importantly, were they accessible for someone who is blind or partially sighted? Let’s talk about it.

The Test

The Ellume COVID-19 home test comes with a few packages, a small tube of testing fluid, and some product information leaflets. None of the packaging is labeled in braille or described in the audio overview instructions.

The Analyzer

The long package contains the Bluetooth-connected analyzer, which has a combined power/connect button and a receptacle where the testing fluid is placed (more on that later).

The dropper

The dropper is the second package and contains an elongated round tube into which the testing fluid is squeezed during the test process.

The nasal swab

The nasal swab comes in the blister pack. The swab is long; its base has a flip-top lid and a plastic spacer for children, which needs to be removed if the test is going to be used on an adult.

The testing liquid

The testing liquid comes in a very small tube with a plastic lid that is removed by twisting.  

The Process

The quick start guide goes through each piece of the included materials and then suggests searching your app store for the Ellume app. The analyzer connects via Bluetooth and only works with later-generation smartphones. A compatibility list is available, but if your phone is no more than a few years old you should be fine.

The first thing you are asked to do is squeeze the contents of the testing fluid tube into the dropper. The tip of the tube twisted off easily, and I emptied the contents into the dropper. Next, you are instructed to connect the analyzer to your phone by turning it on and ensuring the green light is on. This is difficult for a person with no vision, as there is no auditory cue, from either the analyzer or the companion app, that the analyzer is powered on. The next step is to hold the power button on the analyzer until the green light starts flashing rapidly – again with no audio cue from the analyzer or the companion app to let the user know that the unit has been successfully connected.

Removal of the plastic spacer on the nasal swab is next; then the user is instructed to place the swab into the nose and rotate three times, repeating the procedure for the other nostril. After swabbing, the unit with the swab is screwed tightly onto the dropper, and the lid is flipped open and turned upside down. The user then needs to place five drops of the liquid into the receptacle of the analyzer unit. There was no way I was going to be able to do this part without a sighted person to either do it for me or guide me in the process.

Once the liquid is in the analyzer, a button click in the app starts the 15-minute timer to await the results. There is no audio cue from the app when the timer starts or when it completes. The user is advised not to close the app, so I went the extra step of setting my phone not to auto-lock, to ensure it would stay on during the test reading phase. When testing is complete there is no audio cue to let the user know, and no audio announcement of the result. There is a “learn more” button at the bottom, and below that a “share results” button. I didn’t want to share my results, so I clicked “learn more” – users should be aware that the share button would be easy to press inadvertently.

Results

My test result was negative, but it was an eye-opener that this test is not as accessible as advertised. Downloading the app and putting the testing liquid into the dropper were easy, but not being able to see the light on the analyzer, with no audio cue from the app to guide the process, made things difficult. The requirement to place five drops of liquid into a small hole on the analyzer is also a non-starter for a totally blind person trying to do this test alone.

Reporting

My result was negative, but I was informed when setting up the app that my personal information, like date of birth, location and test result, may be shared with the CDC, making this test not anonymous. If that is a concern for you, now you know. I don’t mind sharing my results, especially considering I am sharing them publicly with all of you today.

Thoughts

I’m not sure the test in its present form really qualifies in my mind as accessible. It is cool that the unit connects via Bluetooth and delivers the results, but the process of getting those results needs to be reconsidered before the test is truly accessible without a sighted assistant.

Things could be improved. In the app, a button giving the user audio cues for completing the current step would be of great benefit and would minimize the impact on fully sighted users. An audio cue for powering on the analyzer would be great, but I doubt that could be accomplished without a redesign of the unit. Adding an audio cue when the app connects to the analyzer should be easy and would be a good indication that the unit is ready to accept the testing fluid.

Asking a blind or nearly blind person to squeeze five drops of liquid into a tiny hole is not a good plan. To minimize the possibility of user error, the user should be able to turn the dropper over and squeeze all of the fluid into the reservoir of the analyzer unit. The team had the right idea with the swab and dropper – the user swabs, then screws the swab onto the dropper. The dropper should likewise screw onto the analyzer in some way that would minimize the likelihood of the user missing the receptacle when squeezing in the fluid.

Lastly, the app or the accompanying materials should explain which piece is which, as I did above, especially for users with no sight. This test is being advertised as accessible; it should be accessible to those who are partially sighted and blind alike.

Conclusion

This test is certainly a step in the right direction for allowing those who are blind to test at home accurately; however, it appears as though the team who created the system does not employ partially sighted people, or did little testing with focus groups of blind or partially sighted users. I’m always willing to put my time where my mouth is: if the Ellume team, or any other, wants my feedback on the accessibility of products such as this, I am more than willing to help – all you have to do is ask.

Ordering Information 

Link for the announcement about the COVID tests

https://acb.org/accessible-COVID-tests-announcement

Link to order directly

https://special.usps.com/testkits/accessible

Before you go…

Thank you for checking out our Ellume COVID-19 home test review. I love to hear from my readers – have you tried the Ellume test? What was your experience? If you have questions about my experience with this test, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/


Envision glasses visual aid review

blind travels logo and envision glasses product image

We here at Blind Travels have been following the progress of the Envision glasses visual aid for a couple of years now. Recently, we reported on the availability of this new product and some of its key features. The glasses provide users with low or no vision the ability to read text, identify people and objects, and even quickly determine the denominations of currency. Interest in the Envision glasses was growing even during their early development, when they won the Google Play award for accessibility in 2019. We reached out to Envision, and they sent us a production unit to test. Does this new visual aid live up to the hype? Let’s talk about it…

Overview

The Envision glasses are an electronic visual aid created by Dutch tech company Envision (https://www.letsenvision.com/). The glasses offer blind and low vision users the ability to read text, identify currency, describe a scene, find objects and people, detect light and color, and more. The glasses require a companion smartphone app, which is available for iOS and Android. They are user-friendly, responsive, and offer excellent text recognition accuracy. Let’s take a deep dive into the look and feature set of this innovative visual aid.

The look

The Envision glasses are built on the Google Glass Enterprise Edition 2 platform, and while there was a lot of hype and media coverage around Google Glass back in the day, most people did not recognize them when I wore them in public. The general reaction I got was that they look futuristic and cool. I found them comfortable to wear; the frames are metal and remind me of Oakley sunglasses with a very oversized right earpiece.

The glasses have two buttons: one located at the back of the unit, for power and wake, and another on top of the hinge where the frames fold. They are lightweight, stylish, and easy to wear for a full day. I found them a refreshing change from other visual aids built on larger VR and AR hardware, which tend to be bulky and uncomfortable to wear for long periods – and, even worse, tend to stand out in a crowd. The Envision glasses look like sunglasses with no lenses, and most people just ignore them.

Audio is delivered via a single speaker in the right earpiece. The sound quality was good, and the voice quality was on par with current screen readers. In most cases, I kept the volume around 30%, and bystanders could not hear what the unit was saying, or just heard a low garble.

Features and Interface

Interaction with the Envision glasses is performed via a swipe pad located on the right earpiece, just behind the hinge of the frames. The swipe pad recognizes single-, double-, and triple-finger gestures, which may sound overwhelming at first, but with just a couple of minutes of use I was able to easily navigate the menu structure. Audio descriptions and instructions were clear and concise, and tended to show up just when I needed them most or required a bit of reminding about a feature’s functionality.

Read

The Read menu offers the user instant text, scan text, and batch scan of documents. Instant text reads any text that appears in front of the unit and is great for reading labels on boxes and bottles. I also found instant text perfectly suited for fast food restaurants; it was very good at reading the menu on the wall above the cashier. It did not work well for reading a menu in a sit-down restaurant. I did find that the tool will often read things that are across the room, which can be a bit confusing when you are holding a box of cereal and it starts reading the label of a can of soup across the kitchen.

Scan text and batch scan work a bit differently from instant text; they give audio directions to the user to position the document for optimum accuracy. This works great for reading a book: as you hold it up, the camera clicks, takes a picture, and reads the text. There is usually a short delay while the glasses prepare the text, and the accuracy was very good to excellent. Quite often the unit will spell out a word rather than read it, but that was not a big deal. Batch scan allows you to scan multiple pages into memory at a time and have them read to you. The only downside of the scan text and batch scan features is that they require a wi-fi connection to operate. The first thing I did when I got the review unit was go to a restaurant expecting to have the glasses read the menu to me, but since the restaurant lacked a wi-fi connection the feature did not work. I was able to muddle through with instant text, which does not require a wi-fi connection.

Overall, the accuracy when reading text was well above my expectations. A trick I learned was to place a blank piece of paper over the page not being scanned when using the scan text and batch scan options; this seemed to help the unit frame the document a bit faster.

Call an Ally

The next menu option is Call. This feature allows the user to call a pre-approved person (set up through the app) for assistance. Using the camera in the glasses, the person on the call can see what you are seeing in real time and help direct you in the event you are lost or cannot find something. To test this functionality, I added my wife and son to the system, which only took a few minutes. It should be noted that this feature only functions when the unit is connected to wi-fi, so when I was at the airport, I had to log into a nearby restaurant’s wi-fi to test it properly. I hope that in future versions of the software this feature can be activated over the phone connection, as it would then function just like assistance apps such as Be My Eyes.

Identify

Identify is next on the menu, offering scene description, detect light, identify cash, and detect colors. Describe scene allows you to have the Envision glasses describe what they see in front of you. The system seems to do a good job of picking out the most important thing in the scene: if there is a woman sitting in a chair by a desk with a computer on it, and a cat on the floor, the glasses will tell you “woman sitting in a chair” or “woman sitting in a chair at a desk”. This feature does require wi-fi; I found it useful in home and office settings.

Detect light uses a tone to inform the user of the amount of light around them: a lower tone denotes darker ambient light, while a higher tone indicates brighter light. A completely blind friend found this feature useful when preparing for bed, to determine if they had left a light on; the rising tone allowed them to determine the direction of the light that was still on.

Detect colors performed well when identifying many colors. The glasses accurately identified the change in color of an object as the amount of light present changed: a purple ball was identified as purple under good illumination, but as the light source was moved farther away, the glasses identified different shades of the color until it appeared as dark gray. A really cool feature, and great for exploring your environment. Both the detect colors and detect light features worked without a wi-fi signal.

Find

We all lose things, and the Envision glasses are very good at finding them. The first option in the find menu is find object. The user can scroll through the menu and have the glasses find a bench, bicycle, bottle, car, cat, chair, dog, keyboard, laptop, motorbike, sofa, table, toilet, traffic light, or train. This feature does not require a wi-fi signal, and I found it very useful, especially for finding traffic lights, benches, and chairs. When searching for an object, the glasses emit a “found it” tone when the object is in front of you, and continue to emit it as long as the object remains directly ahead.

Find people is such a cool feature of the Envision glasses. Users train the glasses to find a specific person with the smartphone app. When training this feature, the app asks for a name, then directs the user to take five pictures of the person’s face. I found that a straight-on shot, followed by left, right, top, and bottom angles, gave excellent results. Once you have taken the photos, the glasses take a couple of minutes to process; then find person will identify that person with near-perfect accuracy. Entering a room with a person trained into the system yields a “Looks like (name)”. This feature does not require a wi-fi connection, and I found it to be very accurate even in low light.

The Explore feature also does not require a wi-fi connection and allows the user to explore objects in their surroundings. It was able to easily identify the objects in the find object list above. This feature is handy if you are in a new room and looking to get the layout of furniture and other things. To get the most out of it, survey your environment slowly and give the glasses time to identify objects in front of them; if you move too quickly, the glasses will miss things and not read them out to you.

I assume it is because of the processing power required, but the find functions did tend to drain the battery quite a bit quicker than most other features of the glasses. I would recommend a battery pack to charge the unit if you are traveling with the glasses and relying on them to give you the lay of the land in new surroundings.

Cool features

The Envision glasses are a capable and fun visual aid, allowing the visually impaired user a lens into the visual world around them. Much attention to detail has gone into the creation of this unit; one feature I really appreciated was the voice announcing the battery level when charging. The unit also conveys the battery level, time, date, and the wi-fi network you are currently connected to with a double tap from the home menu.

When you purchase a pair of Envision glasses, you are scheduled for a free onboarding call with the team, which allows you to ask questions and gain some insight into using the glasses to their fullest potential. Take some time to get to know your glasses before the call; the team is very helpful in answering all of the questions you may have about the functionality of the unit. I love that this is offered to all Envision glasses purchasers; it really sets a new user off on good footing when learning to use a product like this.

New features

Shortly after I received the glasses, a software update came through which added functionality for the glasses to instruct the user in positioning documents for optimum scanning performance. When attempting to scan a book or document, the glasses tell you to slowly move the page to the left or right, or to move your head left or right. Small increments of movement are best here; otherwise the glasses tend to get a bit jumbled. Slow, deliberate movements allowed the glasses to line up the material to be scanned quickly.

A second update during my testing time implemented voice commands, allowing the user to quickly access features like instant text and find object. To access the voice commands, the user depresses the hinge button where the large right earpiece attaches to the frame. There is a bit of a time delay when activating the voice commands, which took some getting used to, but once you have the timing down for issuing a command, they work well even when the unit is not connected to wi-fi. I thought the Envision glasses were great with the manual menu system, as it was easy to navigate, but the addition of voice commands makes the glasses even more useful in day-to-day use.

Everything we own these days needs to be updated, and it feels like some devices can take ages to complete an update (I’m looking at you, Windows), but updating the software on the Envision glasses was quick, and the audio instructions were clear and easy to follow. Read more about the team’s commitment to improving the platform below.

Performance

When it comes down to it, how well a product performs is directly related to the likelihood of a user continuing to use it. The Envision glasses handled the tasks I threw at them quickly and accurately. Battery life was good, and I was able to get a day’s worth of use out of a full charge with moderate use. Identifying objects in the find object list was generally quick, depending on the scene (as is to be expected): if the object the glasses were looking for was by itself, they identified it much more quickly than if there were other objects around to distract the identification process.

Built as a platform

Time to get a bit more technical: we here at Blind Travels love to report on technology related to low vision and blindness, and many readers here are interested in the more technical aspects of a product, so I believe the Envision glasses are well worth delving into. Recently, I spoke with a couple of members of the development team about the Envision glasses project, and I was pleasantly surprised to find out that the team is treating the glasses as a platform that can be expanded upon.

Platform agnostic

The Envision glasses are built on the Google Glass Enterprise Edition 2, but the code running the glasses is platform-agnostic. What this means is that if Google decided to stop manufacturing the Glass platform, Envision could pivot to another hardware platform without having to start from scratch. The team is forward-looking in terms of project longevity, which means investing in the platform as a customer or a developer appears to be a safe bet.

Developer kits

The core of the software running the Envision glasses is written with platform “wrappers”, meaning porting it to support other glasses or AR-type hardware is something the team is considering. The team plans to make a software development kit (SDK) for the glasses available as early as next year, though which development environments will be supported has not been set in stone as of this writing. I applaud this move; as we have seen with early game consoles, giving independent developers access to the development environment at a reasonable cost allows the creation of some pretty cool (and often crazy) advances in the capabilities of a platform.

Future functionality

Envision is currently in talks with the developers of several well-known navigation apps commonly used by low vision and blind users, with the hope that they will eventually bring their robust location and navigation abilities to the Envision glasses platform.

Product specifications

From the Envision website:

Camera: An 8-MP camera with a wide field-of-view.

Connectivity: Wi-Fi and Bluetooth.

Battery: 5-6 hours with regular usage. USB-C supported fast charging.

Audio: Directional Mono Speaker, USB audio and Bluetooth audio.

Robust and Light: Water and Dust resistant. Weighs less than 50 grams.

Difficulties

No visual aid is perfect, and as with anything, there is a learning curve the user must navigate to become adept enough with a device’s functions to take full advantage of its capabilities. The Envision glasses are no different, and though the unit has a few idiosyncrasies, none of them are deal-breakers in my book.

Waking up

I did find the glasses entering sleep mode quite a bit when using them throughout the day. Users need to be aware of this, because it can take quite a few moments for the system to come back online from sleep. I surmise this is to preserve battery power, and I fully realize that there is a balance between preserving power and availability to the user. The Envision glasses are not an “instant on” device, and when they are in standby mode there is no audio cue for them powering back on, something I suspect is a limitation of the hardware.

So sensitive

The primary user interface for the Envision glasses is the swipe pad on the right earpiece. Learning the interface is straightforward, and within a couple of minutes it is easy to have all the gestures down to effectively navigate the menu. The swipe pad, however, does not always recognize two-finger gestures, which are the way to access the return-to-main-menu and volume control options. It often took multiple tries to return to the main menu (two-finger swipe down) and adjust volume (two-finger swipe up). When using the glasses in read mode, I often found myself inadvertently enabling offline mode while navigating; I believe this is due to the sensitivity of the swipe pad and the location of the offline mode menu option. Not a big deal, but it did take some getting used to.

Standby mode 

My review unit encountered a few hiccups when powering up from sleep or standby mode. If you do not use the glasses for a bit, or fold them up and set them on your desk, they enter sleep mode, and there is often a delay of 30 seconds or more when they power back up to a usable state. There is no sound or user notification when the unit is starting up, and at first I found myself thinking the unit was powered off, so I held the power button to restart. I assume this is a limitation of the hardware platform the glasses are built upon. In my testing, there were also several times I could not get the glasses to power up at all and ended up plugging them in, which triggered a startup. I am not sure if this is a common issue with the glasses or just an issue with the unit I was reviewing.

Standby battery

Standby battery life for the Envision glasses was a bit of an issue. From a full charge before bed, I set the glasses on my desk and returned in the morning to around 59 percent charge. This is a concern when traveling: I used the glasses to navigate the airport before my flight, then put them away until arriving at the next airport, by which time they were just below 50 percent charge. Not a deal-breaker, as I travel with a power brick for my USB devices and the glasses offer fast charging, but I thought it was worth noting. Your use case and mileage will vary.

Low light

The Envision glasses have some difficulty reading and finding objects in low light. I commonly encountered the “not enough light” error when trying to read menus in darker restaurants. I am sure this is a limitation of the hardware and could perhaps be improved in future revisions of the product.

Features I would like to see

The quality and accuracy of the Envision glasses are great, and having a pair of lightweight, easy-to-use glasses that will read things to a blind or low vision person will be a game changer for many users. However, there is always room for improvement, no matter how awesome a product is. After a couple of months of testing, here is a short list of features or improvements that would be of great benefit to the platform:

Refine instant text

Instant text tends to read things far away from the user, which can be a little confusing. The unit is just reading what is in front of it, which makes sense, but if the glasses had the ability to read something the user is pointing to, it would be a natural way to interact with them. For those who have some vision, pointing at a sign or other text could be an amazing feature. If adding a feature like this would limit the functionality of instant text in offline mode (when the unit is not connected to the internet), then perhaps it could be a separate menu option, such as point read.

Games

The Envision glasses do not recognize playing cards as of this writing. When I first heard about the glasses, I had dreams of playing poker or other card games without a sighted person to aid me. I know this functionality is being worked on at Envision, but I can imagine that teaching the AI algorithm to recognize cards will be a significant task; just think about how many different card styles there are, since every playing card manufacturer has its own look. Envision could limit the functionality to a few card manufacturers (I smell a collaboration opportunity), and if card recognition could be implemented with the point-read option mentioned above, then low vision players could start enjoying games without braille cards.

Unify identification

Currently, users need to enter the currency menu function to identify bill denominations; it seems as though the explore feature should be able to access the same data that allows the glasses to identify bills. This also goes for people who have been trained into the glasses system. When exploring a room, people were not identified by name, but rather as “man” or “woman”, along with what they are doing, like sitting in a chair. I hope future revisions of the software can make the explore and identify operations more general: if the user trains the glasses using the train Envision feature, that training should roll out to the whole system, as it is a disconnect that the system can identify something in one mode but not in the others.

What do they cost?

Assistive technology can often come with a bit of sticker shock. The Envision glasses recently had a significant price drop to $2499 (USD), and part or all of that cost may be covered by insurance, depending on your plan. I have reached out to a couple of different insurance providers (including my own) about the current percentage being covered and will update this review once I have some definitive numbers to share. For now, contact your provider and ask whether they cover a percentage of the cost of purchased visual aids.

Envision also recently announced that their companion app, which has many of the features of the glasses, is now free to use on iOS and Android. This is great news.

The verdict

As someone who has low vision, I found the Envision glasses very useful and enjoyed the time I had with them to create this review. The Envision team is committed to improving the product, and the addition of voice commands adds another layer of usefulness to an already well-thought-out user interface. The glasses are stylish and lightweight, with good battery life. Considering the team’s plans for improving them, the Envision glasses look like a safe bet for anyone with low or no vision.

I’d like to take a moment to thank Envision for sending out a review unit for me to test.

Resources

If you would like more information about the Envision glasses, here is a link to the manufacturer’s website.

https://www.letsenvision.com/envision-glasses             

 

Before you go…

Thank you for checking out our Envision glasses visual aid review. I love to hear from my readers – have you tried the Envision glasses, or do you have questions about my time with them? Feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

 


Update on COVID-19 accessible tests

blind travels logo, text and silhouette of guide dog and handler

A couple of days ago I posted about free accessible COVID-19 tests for blind and low vision users, which are readable through a smartphone. Today it was announced that the USPS will send 12 tests in six packages instead of two. Here is more information. I placed my order, and I was given information for five packages, since I have already received my first order.

https://acb.org/accessible-COVID-tests-announcement

Link to order directly

https://special.usps.com/testkits/accessible


Blind Travels featured in Accessible Journeys Magazine

A heartfelt thank you goes out to Accessible Journeys magazine for the wonderful article on Blind Travels in their summer issue. While we focus mainly on blind and low vision accessible travel, the fine folks at Accessible Journeys create articles and tips geared toward a variety of disabilities. I encourage you to take a moment and check out their well-produced and beautiful-looking magazine.

For blind and low vision readers, Accessible Journeys features a screen reader mode, which presents a text-heavy version of the magazine that works perfectly with NVDA and JAWS. The mode also includes its own built-in screen reader.

Here is a link to the summer edition of the magazine.

https://reader.mediawiremobile.com/M%C3%A9lange%20Magazine%20Publications/issues/207950/viewer?page=1

Before you go…

I love to hear from my readers! Have you checked out the latest issue of Accessible Journeys? Let me know what you think! Additionally, if you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

 


Accessible COVID-19 tests are now available

blind travels logo, text and silhouette of guide dog and handler

At-home COVID tests are great, but until now, if you had little or no vision, it was impossible to read the results. These tests’ results are read through an app on your smartphone via Bluetooth. The test is compatible with iPhones and Android smartphones, but not Android tablets: an iPhone needs iOS 12 or later, and an Android phone needs Android 6.0 or later and Bluetooth 4.0 or later to run the tests. The link below has a list of compatible devices from the manufacturer.

Click to access COVID19-Home-Test-List-of-Compatible-Devices_Final-B.pdf

According to the manufacturer, complete audio instructions are provided in the app to aid in connecting to the test units and obtaining results. I have ordered mine and will report back with more information.

You can sign up and order your tests from the USPS website at the link below, or by calling 800-232-0233. Each order is for one package of two accessible tests, and there’s a limit of one order per address.

https://special.usps.com/testkits/accessible

If you would like more information about the program or the tests, here is an article from seniorsmatter.com

https://www.seniorsmatter.com/free-accessible-covid-19-tests-now-available-for-visually-impaired-people/2613400/

Before you go…

I love to hear from my readers! Have you tried the new accessible rapid COVID-19 tests? What do you think? Additionally, if you have questions about this article or any other content on Blind Travels, feel free to drop me a note on my social media links below or right here on Blind Travels. Follow me and I will happily follow you back.

My Photography: www.tahquechi.com

Twitter: @nedskee

Instagram: @nedskee

Blind Travels on Facebook: https://www.facebook.com/blindtravels/

 

