Ray-Ban Meta Smart Glasses Receive Groundbreaking AI Features in the UK
Ray-Ban’s smart glasses have taken a significant technological leap with the arrival of a highly anticipated AI upgrade in the United Kingdom. Developed in partnership with Meta, the glasses now offer several advanced features aimed at transforming everyday experiences and travel alike, positioning them competitively against similar offerings from tech giants such as Samsung.
The most prominent addition to the smart glasses is a feature called “Look and Ask,” which enables users to receive real-time information simply by looking at an object and asking about it. Once the feature is enabled, the glasses respond to voice prompts like “Hey Meta, what am I looking at?” and deliver contextual information based on what the user sees. For instance, someone standing before a landmark such as Big Ben could receive a detailed description of its history and surroundings without needing to consult a phone or guidebook.
This innovation reflects Meta’s broader ambition to integrate artificial intelligence into everyday scenarios, facilitating seamless interactions with the physical world. The company provided examples, including walks through parks where users can identify different types of trees, or city tours where monuments and architecture are explained through voice responses. The aim is to eliminate the guesswork in unfamiliar environments and make spontaneous learning more intuitive.
Another standout feature in the new upgrade caters to global travellers. The glasses can now translate written foreign-language text into English, offering assistance when navigating foreign menus, reading signs, or understanding labels. According to Meta, this functionality will soon expand to include real-time speech translation. Initially, support will be available for English, Spanish, Italian, and French, with plans for additional languages in future updates.
This capability allows for two-way communication between speakers of different languages. Conversations can be translated audibly for the wearer, while their replies are translated and displayed in the Meta View app for the other party. This approach is designed not only to simplify travel but also to foster human connection across language divides, making the technology relevant beyond tourism and into daily communication scenarios.
Beyond translation and sightseeing, the AI upgrade extends practical utility in numerous everyday contexts. For those visiting museums, walking in nature, or exploring new neighbourhoods, the glasses offer the possibility of learning more about the surrounding environment on the go. Meta suggests the features are also well suited to identifying plant species, exploring artwork, and gaining real-time insight into one’s surroundings, enhancing both leisure and educational experiences.
The new tools can be activated by navigating to the Meta View app’s settings menu and enabling the Meta AI option. Once enabled, users can interact with the glasses by speaking naturally, asking about objects, locations, or written text. The hands-free nature of the system ensures accessibility without the need to break focus or pull out a smartphone.
According to Meta, this release marks only the beginning of its vision for AI-enhanced wearables. The company has outlined plans to expand the functionality to more countries, improve the depth of object and speech recognition, and introduce a broader array of supported languages. Such developments hint at a future where AI-enabled glasses could become indispensable tools for learning, exploration, and communication.
While the future trajectory of Meta’s smart glasses remains to be seen, the current upgrade clearly signifies a meaningful step forward. With enhanced AI features now available in the UK, Ray-Ban Meta glasses are set to be more than just a fashion statement—they are evolving into sophisticated tools for real-time information and language accessibility. Whether used for international travel, local discovery, or daily learning, the glasses offer a glimpse into a more interactive and intelligent future.