Meta’s multimodal AI for its Ray-Ban smart glasses, known as “look and ask,” processes multiple modes of information such as images, text, and speech. It’s currently in early access for users in the US and still in a testing phase, limited to a small number of people who opt in.
How it works
The multimodal AI Ray-Ban glasses take advantage of the camera on the glasses to answer questions you’ve asked and to give you information about the world around you.
Meta AI glasses are crazy! The multimodal AI features let Meta AI’s assistant tell you about things it can see and hear through the glasses’ camera and microphones.
Mark Zuckerberg demonstrated the feature by asking the glasses, while standing in his closet, to suggest pants that would match a shirt he was holding. The assistant responded by describing the shirt and offering a couple of suggestions for pants that might complement it.
Marques Brownlee also demonstrated the feature by asking Meta to look and describe what it could see: “Hey, Meta, look and tell me what you see.” Meta responded by describing exactly what was standing in front of it.
Zuckerberg said in an announcement that people would talk to the Meta AI assistant “throughout the day about different questions you have.”
MKBHD also demonstrated the feature by pointing the glasses at a dying plant and asking Meta what to do to bring it back to life, and it responded accordingly.
Meta AI’s assistant can also caption photos you’ve taken, or handle requests for translation and summarization.
Use Meta AI Ray-Ban on the go! It’s still in beta testing, which means it won’t work perfectly for now.
Stay tuned as we bring you updates on the Meta AI Ray-Ban look and ask! You can read other tech news on our platform.