Back in September during Meta Connect, the company previewed a new ability for its smart glasses lineup called Conversation Focus. The feature, which amplifies the voices of people around you, is now rolling out as part of the company’s latest batch of software updates.
When enabled, the feature is intended to make it easier to hear the people you’re talking to in a crowded or otherwise noisy environment. “You’ll hear the amplified sound a little brighter, which helps you distinguish the conversation from the surrounding background noise,” Meta explains. The feature can be activated through voice commands (“hey Meta, start Conversation Focus”) or by adding it as a dedicated “tap-and-hold” shortcut.
Meta is also adding a new multimodal AI feature for Spotify. With the update, users can ask their glasses to play Spotify music that matches what they’re watching by saying “hey Meta, play a song to match this view.” Spotify then starts a playlist “based on your unique taste, tailored for that specific moment.” For example, looking at holiday decorations might trigger a similarly themed playlist, though it’s unclear how Meta and Spotify translate more abstract scenes into themed playlists.
Both updates are now rolling out to Meta Ray-Ban glasses (both Gen 1 and Gen 2 models), as well as Oakley Meta HSTN frames. The update will come first to those enrolled in Meta’s early access program before rolling out “slowly” to everyone.
Meta’s latest take on smart glasses, the Oakley Meta Vanguard shades, also got some new features in the latest software update. Meta added the option to trigger certain commands with a single word instead of saying “hey Meta.” For example, saying “photo” is enough to take a photo, and saying “video” will start a new recording. The company says the optional feature is intended to help athletes “save breath” while running or cycling.








