Google I/O 2024: New AI Assistant Can Read Everything Through Your Phone’s Camera


Google demoed its new Gemini AI assistant working through the phone's camera

Google I/O 2024 was mostly about the company's AI advancements and its next-gen Gemini AI assistant, coming soon.

The Google I/O 2024 keynote was, surprise surprise, all about AI. The company has a lot of catching up to do, with OpenAI taking ChatGPT to GPT-4o earlier this week. The I/O 2024 keynote showed us the work Google has been doing behind the scenes with the help of the Google DeepMind AI team.

One of the products to roll out from that lab is Project Astra, a next-gen AI assistant that promises to bring AI to mobile devices, using spatial understanding and video processing to give you accurate information.

Project Astra – The Everyday AI Assistant

This multimodal assistant from Google, built on Gemini, is basically its way of telling OpenAI that it is here for the battle. So how does this version of the AI assistant work? Google uses your phone's camera to feed the assistant visual context, helping it understand the things around you.

It can even read code displayed on a PC, help you determine its purpose, and work through complex code. That's not all: you can point the camera at the street and ask the assistant where you are located, and get more localised details if needed.

Google says Project Astra's capabilities will arrive via Gemini Live in the main Gemini app later this year. The tech will initially work on Pixel phones, and Google sees the assistant coming to more devices, including smart glasses and even TWS earbuds, some day.


