Insights From The Blog

Beyond the Headset: The Evolution of Smart Glasses and AI Integration

Undoubtedly, the prize for the fastest-growing technology of the past eighteen months has to go to Smart Glasses.

From starting as a bit of a technological oddity a couple of years ago, Smart Glasses have become the must-have accessory for many, and the next iteration promises to be even more powerful and capable as AI becomes firmly entrenched in the software. Let’s have a look at how Smart Glasses have evolved and what we can expect from the tech in the coming iterations.

The Rise of Smart Glasses

Aside from early attempts to put technology on our faces, such as the ill-fated Google Glass back in 2013, Smart Glasses only really became useful with the launch of the Ray-Ban Stories range in 2021. Even then, they gained just a modest following amongst social media users, but not really anyone else.

Not dissuaded by the relatively poor sales, Meta were convinced that Smart Glasses had a market, and pushed forward with the upgraded Ray-Ban Meta glasses in 2023. These were much more of a commercial success, shifting over 900,000 units in the last quarter of 2024 alone. However, it is a fickle market, and we have seen both a reimagined Google Glass and the Bose Frames fail commercially in recent years.

Some have argued that Ray-Ban and Meta hit a market sweet spot, with a combination of cool design – who doesn’t like Wayfarers? – and powerful Meta software, which outshone the opposition and made the Ray-Bans the Smart Glasses to have. Part of the reason for this was the superb specification for a very reasonable price.

The Ray-Ban Metas come with a 12 MP camera, a five-microphone array, open-ear speakers on each arm, significant computing power with 32 GB of onboard storage, Wi-Fi, Bluetooth, and sufficient battery for around four hours’ use. That’s a pretty good specification, but it gets even better with the ability for the glasses to connect with your phone and link to Amazon Music, Spotify, Facebook, and Instagram accounts for instant playback and social media posting. It was brilliant: all of this entertainment in the space of a set of sunglasses, while increasing the overall weight over the standard Wayfarers by just five grams. But the real icing on the cake, and the reason why so many other companies are trying to get in on the act, is the integration of AI into the glasses and the ability to get information about anything you see or hear. If there is one thing that Meta do well, it is AI.

AI is the Answer

Simply having glasses that can play music via your phone and help you post to your accounts is a neat trick, but probably not sufficient for mass-market appeal. However, being able to integrate the power of AI into face furniture has a lot more appeal, and that has got other developers interested. The AI manifests as an intelligent helper that can answer questions about almost anything you can think of. Using the prompt ‘Hey Meta’, you can ask things such as which film Martin Scorsese directed in 2013 (it was “The Wolf of Wall Street”), or which is the solar system’s eighth planet (Neptune), and the answer is delivered straight to you via your Smart Glasses. But that is only the tip of the iceberg. Meta now also offers a feature called “look and tell me…”, with which the user can look at something – a car, a food dish, a monument, an animal – and, upon giving the command, the glasses take a picture and use AI to tell you what you are looking at. The AI is accurate for most everyday objects and scenes. In an art gallery, it can make you seem the smartest person in the room as it relays facts about whatever painting is in front of you. The possibilities are boundless.
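Under the hood, this kind of assistant boils down to routing a wake-phrase command either to a plain language model or, for “look and tell me…”, through the camera first. Meta’s actual APIs are not public, so every name in this minimal sketch is invented; the stub functions simply stand in for real speech, vision, and language services.

```python
def capture_frame() -> bytes:
    # Stand-in for grabbing a still from the glasses' camera.
    return b"fake-image-bytes"

def describe_image(image: bytes, question: str) -> str:
    # Stand-in for a multimodal model answering a question about an image.
    return f"description ({len(image)} image bytes) for: {question}"

def answer_question(question: str) -> str:
    # Stand-in for a plain language-model query.
    return f"answer for: {question}"

def handle_command(command: str) -> str:
    """Route a wake-phrase command: visual queries capture a frame first."""
    if command.lower().startswith("look and tell me"):
        return describe_image(capture_frame(), command)
    return answer_question(command)
```

The key design point is that the visual path adds a capture step before the model is queried, which is why those answers arrive with a noticeable pause compared with plain questions.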

But the potential to become an essential piece of tech has been sealed with the addition of reliable real-time translation through the glasses. This function requires the download of specific language files, but once installed they are there for good. Translation is started with the command ‘Hey Meta, start translation’ and, after a slight delay as the AI converts the speech it is hearing, the translation is read out to you in English by the device. Of course, this all falls down when, having understood what the person has said, you can’t reply in their own language, but we English have become experts at making ourselves understood by being loud and gesticulating wildly. Even that may become a thing of the past, though, as the next iteration of Smart Glasses gains integrated screens to display information to the user.
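The flow just described (installed language packs, then chunk-by-chunk translation of incoming speech) can be sketched as a simple pipeline. This is purely illustrative: the pack names and functions below are invented, and the real on-device model is replaced by a tagging stub.

```python
LOCAL_PACKS = {"fr", "es", "it"}  # pretend these language packs are installed

def translate_chunk(text: str, source_lang: str) -> str:
    """Translate one chunk of transcribed speech into English."""
    if source_lang not in LOCAL_PACKS:
        raise RuntimeError(f"language pack '{source_lang}' is not installed")
    # Stand-in for the on-device translation model.
    return f"[{source_lang}->en] {text}"

def translation_session(chunks, source_lang):
    """Yield English translations chunk by chunk, as the speaker talks."""
    for chunk in chunks:
        yield translate_chunk(chunk, source_lang)
```

Processing speech in small chunks rather than whole sentences is what keeps the delay down to the “slight lag” users actually experience.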

Because of their growing popularity, many big tech companies, including Apple, Amazon, and Google, are working on Smart Glasses right now. The fact that Vuzix, XREAL, and Rokid have already shipped models shows that the market is booming. Other companies in the field include Xiaomi, TCL, and the likes of Magic Leap, who are working on a wide range of functions, from cameras and AI assistance to virtual displays and industrial applications.

What To Expect

The next iteration of Smart Glasses is already hitting the market, and we are seeing huge upgrades on the current tech. The Ray-Ban Metas are the current benchmark, and their growing popularity has encouraged others to get in on the act. There are plenty of other companies with their own AI engines, and many are turning their attention to Smart Glasses as a means of delivering their products. In terms of innovation, there are several areas where we expect developers to focus, including:

  • Enhanced displays. The next-generation Smart Glasses are expected to use new display technologies such as Micro-LED and LCoS to provide brighter, sharper visuals with larger fields of view. Some models may even begin using transparent screens to project information directly into the wearer’s field of vision.
  • Novel input methods. Moving beyond the current speech and gesture controls, novel approaches such as neural bracelets are emerging. These bracelets detect muscle signals using electromyography (EMG), allowing control with delicate finger gestures without needing cameras to track the hands. Eye-tracking and possibly even brain-computer interfaces could also be investigated for more natural interaction.
  • Enhanced AI functions. Smart Glasses increasingly lean on AI, which makes capabilities like voice assistance, real-time translation, and contextual awareness possible. AI can look at the world around you and offer help before you need it, like reminding you to get groceries when you’re near a store. It can answer questions, maintain your diary, and come up with good gift ideas for your partner’s birthday. Future Smart Glasses will probably include fully-integrated AI that turns them into a personal assistant that knows you better than you know yourself.
  • Better battery technology. The limited battery life of small devices has long been a major complaint. However, innovations in energy-efficient processors, power-saving modes, and maybe even energy-harvesting features are improving power management and extending battery life.
  • Robust connectivity. Expect seamless 5G from the outset, with the next Wi-Fi standards integrated as soon as they become commercially available, giving lightning-fast downloads and connections anywhere in the world.
  • Increased onboard storage. While super-fast connectivity may allow for cloud storage, the relatively low cost of flash memory means that even a small set of glasses could carry onboard solid-state storage running to terabytes. Why wouldn’t you?
  • Real-time translations. This has got to be the holy grail of wearable tech: your Smart wear listens while a foreign speaker is talking and gives you a real-time translation in your chosen language. The problem comes when you want to reply and the other person is not also wearing Smart Glasses, but this should soon be catered for by small screens located in the glasses, which could display your reply phonetically for you to stumble through.
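Of the input methods above, the neural bracelet is the most novel, and at its core it is a signal-classification problem: map a window of EMG amplitude samples to a gesture label. The toy classifier below is purely illustrative; real EMG decoding uses trained machine-learning models, not the hand-tuned thresholds assumed here.

```python
def classify_gesture(emg_window: list) -> str:
    """Map a window of EMG amplitude samples to a coarse gesture label."""
    peak = max(abs(s) for s in emg_window)
    if peak < 0.1:
        return "rest"    # no meaningful muscle activity
    if peak < 0.5:
        return "pinch"   # light finger movement, e.g. select
    return "fist"        # strong contraction, e.g. confirm or dismiss
```

Because the signal comes from the muscles rather than a camera, gestures work with your hand in a pocket or at your side, which is exactly why this approach suits glasses without downward-facing sensors.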

But these are seen as essentials, and there are many other areas where the hardware will integrate seamlessly with the AI, such as contextual awareness, where Smart Glasses will become increasingly aware of their surroundings, providing real-time information based on location, activities, and user preferences. We are also likely to see virtual wayfinding as a pretty standard feature, with an AR indicator displayed on the inside of the lenses, showing the user turn-by-turn directions.

All of these are exciting features, and all are entirely plausible in something the size of a pair of glasses. Certainly, small doesn’t mean underpowered, and with the potential for layered screens on the inside of Smart Glasses, the scope for powerful XR Apps is boundless.

Coming Soon

So, are we actually anywhere near this with the current crop of new developments? Well, Meta may be innovators, but they are not a company to rest on their laurels, and if the rumours are true, their next Smart Glasses are going to be astounding.

Informed rumours say that Meta will release Smart Glasses with an integral display by the end of this year. Dubbed “Hypernova”, these are claimed to do everything the Ray-Ban Metas do, plus run apps and show images on a small screen projected onto one of the lenses. They are said to come with a “neural” wristband controller for gesture control, similar to the one shown in the Orion demos, and the price is rumoured to be between £800 and £1,200. In addition, Meta is also working on upgraded Ray-Ban glasses, with renders of the potential designs popping up on the web. Not much is known in terms of details, but they should be a significant leap forward. We also have Project Orion coming up, which will integrate AI with AR in a glasses package. Check back here for more details.

Away from Meta, there is plenty more happening as other companies start to get serious with Smart Glasses.

Snap Inc., the owners of Snapchat, have been players in the Smart Glasses market for a few years and are now on the fifth generation of their “Spectacles”. Kinda big, sort of ugly, these devices run the bespoke Snap OS with Google Gemini on board, which does a pretty good job of overlaying information and graphics on the lenses. Snap is constantly honing these, and the sixth generation is expected in 2026, which the company claims will be less bulky and will have punchier AI.

Xreal are hoping to hit the ground running next year with their Project Aura Smart Glasses, which promise to be groundbreaking in terms of performance and AI integration. The glasses are expected to be the first eyewear equipped with the powerful Android XR system. The standard Air 2 Ultra models are expected to be supplemented by top-of-the-range One Pro models that will include elements such as prescription lenses and Bose surround sound.

Chinese start-up Even Realities has recently launched its Even G1 Smart Glasses. Powered by the company’s bespoke OS, these project information onto the lenses, allowing for direct language translation and other features like precision waypoints for navigation. The text appears to float a couple of metres in front of the user, making the glasses comfortable to use, rather than forcing you to squint at small text in the corner of a lens. The OS can apparently use localised AI for simple tasks but will tap into cloud-based AI like ChatGPT if something more is required.
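That local-first, cloud-fallback approach amounts to a simple dispatch decision: cheap, latency-sensitive tasks stay on the device, and anything open-ended goes to a remote model. Even Realities’ actual heuristics are not public, so the task list and function below are invented purely to illustrate the pattern.

```python
ON_DEVICE_TASKS = {"clock", "timer", "waypoint"}  # hypothetical task list

def route_request(task: str, prompt: str) -> str:
    """Send simple tasks to the local model; everything else to the cloud."""
    if task in ON_DEVICE_TASKS:
        return f"local model handled: {prompt}"
    return f"cloud model handled: {prompt}"  # e.g. a ChatGPT-style API call
```

The pay-off of this split is battery life and responsiveness: the radio only fires up for the queries that genuinely need a large model.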

Magic Leap are now major players in the Smart Glasses market, and while their tech may look a bit odd compared to more fashionable entries, it packs a huge punch. The Magic Leap 2 AR glasses are amongst the most powerful currently available and are marketed as a commercial solution rather than a lightweight fashion accessory with abilities, as many Smart Glasses are. We have reported on the Magic Leap glasses before, and no doubt we will be revisiting them as they start to outshine the rest of the market.

Google’s Project Martha glasses are on the horizon, expected before the end of the year, and they are likely to be game-changing. Revealed at Google I/O 2025, with Android XR and Gemini AI included, these are expected to be powerful and full of features, although you may never see them: Google are also working with Samsung to develop Smart Glasses, and these may take precedence. Stay tuned for the latest updates on the Google developments.

Smart Glasses are becoming a massive market, expected to be worth around $100 billion by the end of 2026. The first iterations of this tech have demonstrated what is possible in such small packages, and as they are built upon, these devices have the potential to replace today’s XR headsets and take the virtual world out into the real one at an affordable cost.

 

We predict that this growth in wearable tech will fuel the development of Apps to run on them, covering everything from gaming and entertainment to practical features such as wayfinding and translation. If you have an XR project that you need help with, contact us at Unity Developers and see how we can make it a success.