AI systems now capable of finding lost eyeglasses

In a demonstration, Google showed a prototype artificial intelligence system running on a phone that could answer the question: “Where did I put my glasses?”

The system, which could tap into the phone’s photo and video library, told the user it had “seen” the glasses on a nearby desk.

There was also a live demonstration of using video to search Google.

It came a day after OpenAI launched its latest AI system, GPT-4o, in a presentation in which the system read human expressions via a phone camera, chatted and even flirted.

Google had teased the capabilities of its phone-based systems ahead of OpenAI’s announcement.

The firm showcased multimodal features in Gemini Nano, an AI assistant that runs on its Pixel phones, and in the Gemini app.

It also demonstrated a prototype scam-alert feature being tested for Gemini Nano, which could listen to a phone call and warn the user if it appeared to be fraudulent.

The new AI-powered demos were revealed at Google I/O, the firm’s annual presentation for software developers.

Speakers included Sir Demis Hassabis, head of Google DeepMind, who stressed that the company’s AI can now handle images, video and sound, and draw connections between them.

There was also a demonstration of a prototype system that would act as a virtual “teammate” for users, able to perform tasks such as attending multiple online meetings at once.

Sir Demis also presented Project Astra, an effort exploring the future of AI assistants.

The Astra prototype was shown answering spoken questions about what it was seeing through a phone camera.
