The smartphone changed how people interact with the world by putting a computer in their pocket. Meta is betting the next shift is simpler: putting it on your face instead.
Developed in partnership with eyewear giant EssilorLuxottica, its second-generation AI-embedded glasses can answer a question, translate a sign or describe a room without the wearer touching anything.
They go on sale today across Ray-Ban and Oakley stores, Sunglass Hut outlets and authorised optical retailers in Singapore, the first Southeast Asian market to carry both lines. A dedicated pop-up runs at Orchard Road from May 6 to 17.
The pairing of Meta's technology with EssilorLuxottica's portfolio gives each company something it cannot produce alone. Meta gets frames people already want to wear. EssilorLuxottica, which owns Ray-Ban, Oakley and Sunglass Hut, gets an AI platform embedded in its distribution network.
“As one of the world's most AI-advanced economies with a thriving tech ecosystem, Singapore is well-positioned to adopt innovative technologies that can enhance daily living. Together with Meta, we are inviting people to experience how they can connect with the world and others more seamlessly and effortlessly,” says Denis Chng, country manager for Malaysia and Singapore at EssilorLuxottica.
Ray-Ban Meta covers everyday wear through three styles — Wayfarer, Skyler and Headliner — starting from $629, with sun, clear, polarised and Transitions lenses available, the latter from $759.
Oakley Meta targets performance. The HSTN, from $659, is built for active use with IPX4 water resistance and Oakley's PRIZM lens technology, which sharpens contrast across changing light conditions.
The Vanguard, from $819, is engineered for high-intensity training with IP57 dust and water resistance, a 124-degree ultra-wide camera capable of 3K video, Strava and Garmin integration for real-time performance tracking, and up to nine hours of battery.
Both Oakley models share a 12-megapixel camera, open-ear speakers and voice access to Meta AI.
“Every generation, computing gets closer to us — from mainframes to desktops to smartphones. Now, AI meets you at eye level. We are building personal superintelligence for everyone, and glasses are the ideal way to experience it because they understand your context in real-time,” says Nicole Tan, managing director for Meta Singapore.
(From left) Ivan Chuavon, vice president of marketing for Southeast Asia and Korea at EssilorLuxottica, and Nicole Tan, managing director for Meta Singapore. Photo: Albert Chua/The Edge Singapore.
When the glasses become a guide
For most buyers, these smart glasses offer convenience, but for blind and low-vision users, they offer access.
Meta’s partnership with Be My Eyes, which went live on its smart glasses in 2024, allows users to connect hands-free to sighted volunteers who can describe what the wearer is seeing, entirely by voice and without needing to handle a phone.
A March 2026 update extended that reach to personal contacts and selected businesses, so the same command can support everyday assistance or connect users to customer service teams at companies, including Hilton and Sony, where available.
The Oakley Meta Vanguard adds a dedicated action button that triggers these calls without any spoken command. For users with partial vision, the smart glasses also support Read Text Verbatim and Detailed Descriptions, features that narrate text and scenes through the open-ear speakers on demand.
The hardware makes this possible in ways earlier assistive devices could not. The camera captures the wearer's point of view continuously. Open-ear speakers relay responses without blocking surrounding sound, keeping users aware of their environment. Voice commands handle AI queries and calls for help without interrupting movement.
Where the AI falls short, Be My Eyes fills the gap. A volunteer who can be questioned and redirected in real time offers a different order of reliability from a system that can misread context or return the wrong answer.
That combination lands in a city where the need is documented. Findings from the Singapore Epidemiology of Eye Diseases study show that about one in 10 adults over 40 develops visual impairment within six years, with age and income as the strongest predictors.
Singapore’s population is ageing, and that segment is set to grow. National initiatives such as the Enabling Masterplan 2030 have placed greater emphasis on inclusion, and agencies such as SG Enable have supported trials of assistive tools. What has been less common is a consumer device that connects users to real-time help in a form they might choose to wear daily. Until now, that device has looked like a phone, but it may soon look like a pair of smart glasses.