Meta's Ray-Ban glasses are about to get a lot smarter


Meta (META) CEO Mark Zuckerberg showed off some impressive upgrades to the company's hit Ray-Ban Meta Glasses during its Meta Connect event on Wednesday, including the ability to translate live conversations, carry on continuous conversations, and recognize the world around you.

The updates make the Ray-Ban Meta Glasses, which start at $299, a far more intelligent piece of technology, giving them features similar to smartphone-based AI tools like Google's (GOOG, GOOGL) Circle to Search and Gemini, as well as Apple's (AAPL) upcoming improved Siri and visual intelligence.

According to Meta, you'll now be able to do things like ask the glasses to remember where you parked. Or, if you want to know about landmarks in a city you're visiting, the Ray-Ban Meta Glasses can tell you more about a statue, building, or location when asked. At the grocery store and want to know what to make for dinner? Meta says the glasses can suggest what you should make based on what you're looking at while walking the aisles.

Meta is updating its Ray-Ban Meta Glasses with new AI capabilities. (Image: Meta)

The company also says it's adding real-time translation to the Ray-Ban Meta Glasses. Zuckerberg showed off the feature during a live demo. And while there was a small delay between when a person stopped talking and the glasses began translating, it certainly appeared to work well. In fact, live-translating a conversation with the glasses looks far more natural than doing the same thing while holding up a smartphone.

Finally, Meta showed off a new limited-edition transparent design for the Ray-Ban Meta Glasses that lets you see the device's internal components. It gives the glasses a late-90s/early-2000s look that more tech companies should adopt. (Says the guy who grew up during that time.)

In addition to updates for the Ray-Ban Meta Glasses, the company also announced its first AI vision models. The new open-source models, called Llama 3.2 11B and 90B, are designed to understand images such as charts and graphs.

The idea is to let developers build apps on the new models that can, for example, tell a hiker where a particular route gets steeper by having the AI read over a map of the trail.
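For developers, querying one of the new vision models might look roughly like the sketch below. This assumes the 11B instruct variant is accessible through Hugging Face's transformers library; the model ID, image URL, and trail-map prompt are illustrative, not taken from Meta's announcement.

```python
# Minimal sketch: asking a Llama 3.2 vision model about a trail map image.
# Assumes access to the gated "meta-llama/Llama-3.2-11B-Vision-Instruct" checkpoint
# on Hugging Face and a transformers release with Mllama support.
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, MllamaForConditionalGeneration

model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"  # assumed model ID
model = MllamaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

# Hypothetical trail map; any chart, graph, or map image would work the same way.
image = Image.open(requests.get("https://example.com/trail_map.png", stream=True).raw)

messages = [
    {"role": "user", "content": [
        {"type": "image"},
        {"type": "text", "text": "Based on this trail map, where does the route get steeper?"},
    ]}
]
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(image, prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=100)
print(processor.decode(output[0], skip_special_tokens=True))
```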

Meta's AI work has quickly made it one of the top open-source AI companies around. And if it can keep innovating, it could stand among the biggest beneficiaries of the AI revolution.

Subscribe to the Yahoo Finance Tech newsletter.

Email Daniel Howley at [email protected]. Follow him on Twitter at @DanielHowley.

For the latest earnings reports and analysis, earnings whispers and expectations, and company earnings news, click here.

Read the latest financial and business news from Yahoo Finance.