Google Begins Publicly Testing Its AR Glasses

A decade after Google Glass, Google is getting back to testing smart glasses in public. The company announced its own smart glasses initiative earlier this year at Google’s I/O developer conference, a project aimed at assistance rather than entertainment. Google is now starting to publicly test those smart glasses, the company announced today, beginning with dozens of Googlers in field use and ramping up to several hundred by the end of the year.

Google’s glasses are AR of a sort, relying on audio assistance that can use built-in cameras to identify objects in an environment through AI, similar to how Google Lens can identify objects and text with phone cameras. The glasses will not, however, be able to take photos or videos. Google is limiting those features on its field-tested glasses, focusing entirely on how the glasses can train their AI to recognize the world better.

The glasses, based on glimpses Google has shown in videos and photos, look nearly normal. But unlike Meta’s publicly available and normal-looking Ray-Ban Stories glasses, which are designed mainly for taking photos and listening to music, Google is focused on utility and assistive uses for its smart glasses right now: the specific early test cases Google mentions at the moment are translation, transcription, visual search, and navigation that will work with heads-up overlays, similar to how Google Maps uses heads-up AR directions on phones.

Google’s AR glasses prototype testers are prohibited from using the glasses “in schools, government buildings, health care locations, places of worship, social service locations, areas meant for children (e.g., schools and playgrounds), emergency response locations, rallies or protests and other similar places,” or while driving or playing sports. Google hasn’t said where in the US, specifically, these glasses will be tested.

According to Google, “an LED indicator will turn on if image data will be saved for analysis and debugging. If a bystander desires, they can ask the tester to delete the image data and it will be removed from all logs.” The glasses don’t take photos or videos, but they do use image data for their assistive AI. Google says that “the image data is deleted, except if the image data will be used for analysis and debugging. In that case, the image data is first scrubbed for sensitive content, including faces and license plates. Then it is stored on a secure server, with limited access by a small number of Googlers for analysis and debugging. After 30 days, it is deleted.”

Field-testing future smart glasses seems to be an increasing trend. Meta started testing prototype depth-sensing camera arrays on a pair of glasses called Project Aria two years ago, focusing on how smart, sensor-filled glasses could be used responsibly in public places.

Google already ran its own large-scale smart glasses test nearly a decade ago when it launched Google Glass, a device that sparked many of the first conversations about public camera use and privacy with AR headsets and glasses. Google’s new project looks to be on a far smaller and more focused scale right now, and the company hasn’t announced plans for the glasses to become a commercially available product yet.
