
At the Augmented World Expo 2025, Snap Inc. officially announced that its next-generation Spectacles, simply called “Specs,” will launch publicly in 2026. The new Specs are a lightweight, high-performance wearable computer designed as everyday glasses with see-through lenses, bringing augmented reality (AR) and artificial intelligence (AI) directly into the physical world.
Evan Spiegel, Snap’s co-founder and CEO, said: “We believe the time is right for a revolution in computing that naturally integrates our digital experiences with the physical world, and we can’t wait to publicly launch our new Specs next year. We couldn’t be more excited about the extraordinary progress in artificial intelligence and augmented reality that is enabling new, human-centred computing experiences. We believe Specs are the most advanced personal computer in the world, and we can’t wait for you to see for yourself.”
Snap says it has invested more than $3 billion and 11 years of research and development in Specs. Unlike smartphones, which confine digital interaction to a flat screen, Specs are built to make computing spatial, intuitive, and hands-free, allowing users to browse, stream, interact, and play without reaching for a device.
The fifth-generation Spectacles, launched in 2024 for developers, laid the groundwork for this upcoming consumer version. Snap’s AR platform now sees over 8 billion lens interactions daily, with 400,000 developers creating more than 4 million lenses.
Specs are designed to understand the world using advanced machine learning and offer context-aware assistance, game overlays, and creative tools. Whether you’re cooking, travelling, or exploring a city, Specs aim to blend digital interaction seamlessly with real-life experiences.
Example applications already in development include:
• Super Travel (Gowaaa): Translates signs, menus, and currencies for global travellers.
• Drum Kit (Paradiddle): Offers guided drum lessons with visual overlays.
• Cookmate (Headraft): Suggests recipes and provides step-by-step cooking assistance.
• Pool Assist (Studio ANRK): Helps improve pool game strategies in real time.
• Wisp World (Liquid City): Lets users go on imaginative, AR-powered explorations.
Snap also announced several updates to its Snap OS and Lens Studio tools:
• Deep integrations with OpenAI and Google’s Gemini allow developers to build multimodal AI-powered experiences using speech, vision, and context.
• Depth Module API enables accurate 3D anchoring of digital content in the physical world.
• Snap3D API supports real-time generation of 3D objects in AR.
• Speech Recognition API brings real-time transcription in 40+ languages, including high accuracy for non-native accents.
Additional tools include a Fleet Management app for monitoring multiple Specs devices, a Guided Mode for immersive location-based experiences, and Guided Navigation for AR-powered museum and city tours.
Snap is also working with Niantic to bring Spatial VPS (Visual Positioning System) to Specs, creating a shared AI-enhanced map of the world. WebXR support in browsers will allow developers to test and deploy AR experiences more easily.
Companies are already integrating Specs into their entertainment platforms, among them Enklu, which runs Verse Immersive theatres in the US. In Chicago, customers can now use the glasses to play SightCraft, with more cities to follow.
Developers interested in building AR apps for Specs can sign up for early access at spectacles.com/lens-studio.
For unparalleled coverage of India's businesses and economy, subscribe to Business Today Magazine.