Meta has initiated an early access test of the highly anticipated multimodal AI features on its Meta Ray-Ban smart glasses. Opted-in users can now try the AI assistant’s advanced capabilities, which let it interpret and answer questions about what it sees and hears through the glasses’ camera and microphones. Mark Zuckerberg showcased the update in an Instagram reel, demonstrating tasks like seeking fashion advice, language translation, and image captioning.

Key Announcements:

  1. Multimodal AI Features Unveiled: Meta has started rolling out multimodal AI features for Ray-Ban smart glasses, enabling users to interact with the AI assistant for tasks such as fashion recommendations, language translation, and image captioning.
  2. Practical Demonstrations by Zuckerberg: In an Instagram reel, Mark Zuckerberg asked the glasses to suggest pants to match a shirt he was holding. The AI assistant accurately described the shirt and offered helpful suggestions.
  3. Features Discussed in September Interview: Zuckerberg first discussed the multimodal AI features in a September Decoder interview with The Verge’s Alex Heath. The AI assistant is designed to answer questions throughout the day, offering insights into the user’s surroundings.

AI Assistant Capabilities:

  • Fashion Recommendations: Users can ask for suggestions on clothing combinations based on what they are holding.
  • Language Translation: The AI assistant can translate text in real time.
  • Image Captioning: Users can seek assistance in captioning photos taken with the glasses.
  • Object Description: The AI assistant accurately describes objects within the glasses’ view, as demonstrated with a California-shaped wall sculpture.

Limited US Early Access Test: The early access test is available to a limited number of users in the United States who choose to opt in. Meta’s CTO, Andrew Bosworth, said the test phase aims to gather feedback from users. Instructions for opting in can be found on Meta’s official platform.

Future Possibilities: The AI features represent a step forward in wearable technology, bringing practical AI capabilities to smart glasses. Meta envisions users seamlessly interacting with the AI assistant throughout the day, making inquiries about their surroundings and receiving real-time assistance.

In conclusion, the early access test for multimodal AI features marks a significant milestone for Meta’s Ray-Ban smart glasses, offering users a glimpse into the future of wearable AI technology.

By Impact Lab