Do Ray-Ban Meta smart glasses work as an accessibility device?


Do Ray-Ban Meta smart glasses work as an accessibility device for people with low or limited vision?

A woman wearing Ray-Ban Meta smart glasses.

Credit: Meta

The Ray-Ban Meta smart glasses were first unveiled in 2023, the result of a collaboration between eyewear company Ray-Ban and tech giant Meta, owner of Facebook and Instagram. Appealing to the fashion-conscious tech nerd, the voice-operated wearable not only lets users take photos and make calls hands-free, but can also use AI to describe the wearer's surroundings.

Though the Ray-Ban Meta was not designed as an accessibility device, its features may lead some to wonder whether it could moonlight as one for people with low or limited vision. As such, Mashable spent a few days testing whether the gadget could be repurposed to that end.

Unfortunately, while it is a novel device, relying on the Ray-Ban Meta smart glasses to help you navigate the world would be foolhardy at best.

What are Ray-Ban Meta smart glasses?

A pair of Ray-Ban Meta smart glasses on a white background.

Credit: Meta

The Ray-Ban Meta glasses boast a relatively compact form factor which looks very much like Ray-Ban's classic eyewear designs, with customers able to choose between Wayfarer, Skyler, and Headliner styles. The glasses utilise Meta's AI assistant, Meta AI, to answer users' queries, with a five-microphone system that picks up voice commands while suppressing background noise. They also have small open-ear speakers designed to minimise audio leakage, and include a built-in 12 MP camera which can take photos and record video.

Despite its high-tech innards, the most noticeable visual difference between the Ray-Ban Meta and standard Ray-Bans is the missing metallic detail at the temples. Instead, the Ray-Ban Meta substitutes in a camera lens on the left and a notification light on the right (the latter activates when a photo or video is being taken, an effort to address concerns about privacy and covert surveillance).

At around 49 grams depending upon the frame selected, the Ray-Ban Meta's weight isn't outside what one may expect for a pair of sunglasses, though it's certainly on the heavier side. It is slightly bulkier than standard Ray-Bans, particularly at the arms (the right of which includes touchpad controls), but still streamlined enough that observers likely won't notice the difference.

What accessibility features do Ray-Ban Meta smart glasses have for people with low or limited vision?

A man wearing Ray-Ban Meta smart glasses.

Credit: Meta

Ray-Ban Meta glasses are targeted at the average consumer, rather than catering specifically to people with disabilities. Even so, Meta does state that the glasses can help people with "reduced vision, hearing, or mobility by offering the ability to perform tasks hands free." Users can also have their Ray-Ban Meta glasses fitted with prescription lenses, uploading a valid prescription with a power between -6.00 and +4.00 when ordering a pair.

The Ray-Ban Meta glasses are primarily operated by voice commands to Meta AI, which requires the Meta AI app to be installed on your phone and connected to the device. The glasses can also be connected to Messenger, WhatsApp, or a user's phone via said app, enabling users to send messages and make calls using voice commands. This may help users perform such tasks without having to look at their phone screen; however, it's worth noting that both iPhone and Android devices can already be operated directly via voice commands without Meta AI.

Ray-Ban Meta users can also issue visual queries, such as asking Meta AI what they're looking at. The glasses will then take a photo and use Meta AI to analyse it, with AI-generated audio describing the scene to the user. Such images and conversation logs are saved to a user's History log in the Meta AI app, and can be shared to the public Discovery feed.
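For readers curious about the general technique, this capture-analyse-speak loop can be approximated with open-source tools. Below is a minimal sketch assuming the Python transformers, Pillow, and pyttsx3 packages, the openly available BLIP captioning model, and a local photo named scene.jpg; it illustrates the approach, and is not Meta's actual pipeline.

```python
# Minimal sketch of a capture -> describe -> speak loop.
# Assumptions: transformers, Pillow, and pyttsx3 are installed,
# and scene.jpg stands in for a photo taken by the glasses.
# This approximates the technique; it is not Meta's pipeline.
from transformers import BlipProcessor, BlipForConditionalGeneration
from PIL import Image
import pyttsx3

# Load a small open image-captioning model (BLIP).
processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

# "Take a photo": here, just read one from disk.
image = Image.open("scene.jpg").convert("RGB")

# Generate a one-line description of the scene.
inputs = processor(image, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=30)
caption = processor.decode(output[0], skip_special_tokens=True)

# Read the description aloud, as the glasses' speakers would.
engine = pyttsx3.init()
engine.say(caption)
engine.runAndWait()
```

As the tests below suggest, the hard part isn't wiring such a loop together; it's getting descriptions specific enough to be useful.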

Ray-Ban Meta's Be My Eyes partnership connects users to volunteer helpers 

Aside from this voice command functionality, the Ray-Ban Meta feature most specific to people with disabilities is its partnership with Be My Eyes. This free service connects users with low or limited vision to volunteers who will look through the Ray-Ban Meta's camera and describe the person's surroundings. According to Be My Eyes, it is "the first and only accessibility tech for blind or low vision users available on Meta AI Glasses."

The Be My Eyes app does work without Ray-Ban Meta, with users simply pointing their phone cameras at whatever they want described. As such, people who primarily want to take advantage of this free service could just download the app to their phone rather than shelling out a few hundred dollars for the Ray-Ban Meta. Using Be My Eyes with the Ray-Ban Meta requires users to download the app and connect it to the Meta AI app anyway.

However, the Ray-Ban Meta glasses do enable users to use Be My Eyes hands-free. They may help frame shots as well, as users merely have to direct their gaze toward whatever they want described. Whether it's worth picking up the Ray-Ban Meta as an accessibility aid may therefore depend on how often a user utilises Be My Eyes. That said, Meta states that the glasses have just four hours of battery life with moderate usage, which means wearing them all day in order to repeatedly use Be My Eyes may not be realistic. Be My Eyes is also only available in the U.S., UK, Canada, Australia, and Ireland, and only supports the English language.


Meta AI isn't intended to be an accessibility device, and it shows

Two screenshots of Meta AI telling a person what they're looking at.

Credit: Amanda Yeo

Unfortunately, aside from their Be My Eyes functionality, the Ray-Ban Meta glasses seem largely unsuitable as an accessibility aid. While Mashable found them at least an interesting novelty (though the Meta AI app's Discovery feed felt like the quiet death of humanity), relying on these glasses to help you navigate the world is an impractical proposition.

As previously mentioned, the Ray-Ban Meta glasses can describe a user's surroundings when asked. However, such responses are relatively vague and don't appear useful for orienting yourself unless you're so lost that you can't tell whether you're in a car park or a playground. For example, Meta AI responded to one query by telling me that I was "looking at the interior of a train, specifically the seating area." While this was true, it wasn't terribly useful information, and missed the display I was facing, which indicated where the train was going.

Three screenshots of Meta AI trying to read an article.

Credit: Amanda Yeo

When asked to read a sign bearing a single word, Meta AI was able to do so. As such, it may help someone determine the appropriate bin to throw their waste in, for example. However, asking it to read an article open on a computer screen produced unsatisfactory results. Looking at the first paragraph of my colleague Belen Edwards's article "The 10 best TV shows of 2025 (so far), and where to stream them," I asked my Ray-Ban Meta glasses to read it to me. The result was a bizarre jumble of out-of-order text, with some lines skipped altogether. When I scrolled down and asked it to continue reading, it would simply recite the text it had already read.

Asking again on another day produced even less accurate results. Instead of reading the text, Meta AI offered a vague description of what it seemed to think the article was about. Repeated requests produced different results each time, with Meta AI sometimes even telling me "the text reads" before offering an inaccurate approximation of the text.

Further tests at a later date showed improved accuracy, with Meta AI reciting much of the article visible on screen. Even so, it still took liberties with the text, inventing a headline and referencing fake shows "The Pilt" and "The Iceberg." After I scrolled down and asked Meta AI to continue reading, it stated that it "can only provide general information, and [is] not able to read articles in real-time."

Being able to simply look at any screen and have Ray-Ban Meta glasses smoothly read it out would theoretically be a boon to many users with low or limited vision. Unfortunately, people who need such assistance would be better off relying on dedicated screen readers for now.
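The gap is notable because reading on-screen text verbatim is a largely solved problem. As a point of comparison, a conventional OCR tool extracts a screenshot's text deterministically rather than paraphrasing it. Here's a minimal sketch, assuming the pytesseract and Pillow Python packages (plus the Tesseract engine pytesseract wraps) and a screenshot saved as article.png.

```python
# Minimal OCR sketch: read on-screen text verbatim from a screenshot.
# Assumptions: the Tesseract engine is installed, along with the
# pytesseract and Pillow Python packages; article.png is a screenshot
# of the text to be read. Unlike a generative model, OCR transcribes
# the pixels it sees rather than approximating them.
import pytesseract
from PIL import Image

# Extract the text exactly as it appears in the image.
text = pytesseract.image_to_string(Image.open("article.png"))
print(text)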

News headlines get muddled by Meta AI

Six screenshots of the Meta AI app being asked for the news.

Credit: Amanda Yeo

Mashable's testing found the Ray-Ban Meta glasses' AI assistant also struggled when it came to matters that weren't literally in front of it. When we tried asking for the day's news headlines, Meta AI confidently offered a humorously incoherent response: "Here are the top three news. First, latest news and stories from around the world are available. Second, latest U.S. news updates are available. Third, latest news headlines are available." Repeating the question produced the same answer.

Asking for a specific publication might get you actual news items; however, they may not be from the outlet you requested. While Mashable didn't report on Jonathan Joss' death, Meta AI offered this news as the top headline on the site at the time of testing. It then offered Mashable's coverage areas of "tech, culture, and everything in between" as a second ostensible headline, before again informing us that "the latest news headlines are available."

Requests for the New York Times' headlines fared better, producing news items that the publication had reported on. However, the given headlines seemed to have been paraphrased, and the information supplied was outdated at best. For example, Meta AI stated that "Israel appears ready to attack Iran," when the first story on the New York Times' website was "Israel Says It Attacked Headquarters of Powerful Iranian Military Unit." Further, while Meta AI stated that 242 people had been killed in a plane crash in India, the death toll had already climbed to 270 prior to our inquiry.

I also tried asking Meta AI for a recipe for a vanilla cake. In response, it provided a list of ingredients and measurements which seemed to be in roughly the right proportions. However, it only partially fulfilled my request, as it included no instructions. Once again, Meta AI demonstrated approximate knowledge of many things, but was still unable to offer useful, usable information.

Meta AI struggles with dinner and travel plans

Screenshots of the Meta AI app showing a user asking for train timetable information, with Meta AI directing them to the Transport NSW website.

Credit: Amanda Yeo

Meta AI also struggled with more personally immediate matters. While it did suggest a nearby restaurant when asked for a "good place to go and get dinner in Sydney," Meta AI stated that it would be open until 10 p.m. that day. In actuality, the restaurant had been closed for months, a fact reflected on both its Google Maps listing and its Instagram page. Despite there being hundreds of operating restaurants in the city, Meta AI somehow managed to select one that had shut down.

The chatbot also fell short when asked to help with travel plans. Requesting help getting around seems like an obvious and expected use of an AI assistant, yet Meta AI was unable to oblige when asked when the next train between two stations would arrive, stating, "I can't help with that yet, but I'm learning every day!" When asked for help with transport plans more generally, it told me to visit my local transport website to check the timetable. It couldn't advise if or when such features might be added either, instead telling me to check the Ray-Ban Meta Help Center.

Asking whether a train station was wheelchair accessible was hit-and-miss, with Meta AI bizarrely responding to my first request by offering the address of a KFC. Fortunately, subsequent inquiries produced more relevant answers; however, considering the quality of previous responses, users will probably feel uneasy about taking Meta AI at its word.

The Ray-Ban Meta may be an interesting toy, but it isn't an accessibility device

A woman wearing the Ray-Ban Meta smart glasses as sunglasses.

Credit: Meta

The Ray-Ban Meta smart glasses aren't primarily marketed as an accessibility device, and their price reflects that: dedicated devices designed to assist people with low or limited vision typically retail for significantly more. For example, an OrCam MyEye 3 Pro will drain your bank account to the tune of $4,490, over 10 times the price of the most expensive Ray-Ban Meta glasses.

In light of this, it's unsurprising that the Ray-Ban Meta glasses underwhelm as an accessibility device for people with low or limited vision. While they may assist users by enabling them to perform tasks such as messaging, playing music, and taking photographs hands-free, they struggle when asked to interpret text in front of them and underperform when asked to provide information more generally. Like all generative AI tools, Meta AI simply can't replace going directly to reliable sources yourself.

If you just want to take a few hands-free photos and calls, the Ray-Ban Meta may have you covered. However, this gadget wasn't designed to be an accessibility device, and certainly shouldn't be relied upon as such.

Amanda Yeo

Amanda Yeo is an Assistant Editor at Mashable, covering entertainment, culture, tech, science, and social good. Based in Australia, she writes about everything from video games and K-pop to movies and gadgets.
