6 new Gemini Live features Google announced at I/O 2025

Google unveiled Project Astra last year at I/O, showing off nascent AI technology that allows mobile users to talk to Google’s AI in real time using conversational language. You might ask the AI to find stuff on the web for you or share your camera and screen so it can see what you see and provide guidance.

Some of those features are available via Gemini Live, Google’s AI-powered assistant for Android and iPhone. But Google isn’t stopping there. It announced several new Project Astra tricks coming to Gemini Live soon, in addition to making its best feature free for Android and iPhone users.

Camera and screen-sharing go free

Having the AI look at what you see in real life or on your screen is a key function for any AI-powered assistant. Google wants Gemini to be more powerful than ever, so it’s making the camera and screen-sharing features in Gemini Live free for all Android and iPhone users. The features will start rolling out on Tuesday.

Gemini Live features. Image source: Google

Gemini Live will also integrate with more Google apps soon, starting with Google Maps, Calendar, Tasks, and Keep.

Finding manuals online and scrolling for information

In Google’s demo video, the user asks Gemini Live to find the manual for the bike he’s repairing. The AI browses the web, finds the document, and asks what the user wants to see next.

The user then tells Gemini Live to scroll the document until it finds a section about brakes. The Android phone’s screen shows Gemini Live doing just that and finding the information.

This kind of agentic behavior suggests Gemini Live will be able to access specific information online, even within documents.
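Google hasn’t said how this works under the hood, but the behavior matches a familiar tool-calling loop: the model proposes an action (search the web, scroll a document), the app runs it, and the result is fed back so the model can decide the next step. Here’s a minimal Python sketch of that pattern; `find_manual`, `scroll_to_section`, and the dispatch loop are invented for illustration, not real Gemini APIs:

```python
# Hypothetical sketch of an agentic tool-calling loop. None of these
# names are real Gemini APIs; they only illustrate the pattern.

def find_manual(query: str) -> str:
    """Stand-in for a web search that returns a manual's URL."""
    return f"https://example.com/manuals/{query.replace(' ', '-')}.pdf"

def scroll_to_section(url: str, section: str) -> str:
    """Stand-in for navigating a document to a named section."""
    return f"[text of the '{section}' section from {url}]"

TOOLS = {"find_manual": find_manual, "scroll_to_section": scroll_to_section}

def run_plan(plan: list[tuple[str, dict]]) -> None:
    """Execute a model-proposed sequence of tool calls, printing each
    result as it would be fed back to the model."""
    for tool_name, args in plan:
        result = TOOLS[tool_name](**args)
        print(f"{tool_name}{args} -> {result}")

# A plan the model might propose for the bike-manual demo:
run_plan([
    ("find_manual", {"query": "mountain bike"}),
    ("scroll_to_section", {
        "url": "https://example.com/manuals/mountain-bike.pdf",
        "section": "brakes",
    }),
])
```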

Find the right YouTube clip

The user then asks Gemini Live to find a YouTube video that shows how to deal with a stripped screw. Gemini delivers.

Looking up information online while sharing your camera

The chat continues with the user asking Gemini Live to find information in Gmail about the hex nut size he needs, while showing the AI the available parts in his garage through the camera.

Gemini Live: Camera-sharing and multimodal functionality. Image source: Google

Gemini Live surfaces the information and highlights the correct part in the live video feed. It’s a mind-blowing feature to have at your fingertips.

Making calls on your behalf in the background

Next, the user asks Gemini Live to find the nearest bike shop and call them to ask about a specific part.

The AI can’t answer right away, since getting the information requires a call to a third party. But Gemini Live tells the user it will follow up with the info once it has it.

Gemini Live: Making a call on behalf of the user. Image source: Google

The user keeps talking to Gemini Live while the AI handles the call in the background.

Once the call is done, Gemini Live provides the needed info while continuing to manage other tasks in parallel.
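That kind of multitasking maps to a standard concurrency pattern: start the slow job (the phone call) as a background task and keep serving the conversation until the result arrives. A rough sketch using Python’s asyncio; `call_bike_shop` and the canned question are made up for illustration:

```python
import asyncio

async def call_bike_shop(part: str) -> str:
    """Stand-in for the real phone call; it just waits, then answers."""
    await asyncio.sleep(3)  # the 'call' takes a while
    return f"The shop confirms '{part}' is in stock."

async def main() -> None:
    # Start the call in the background without blocking the chat.
    call = asyncio.create_task(call_bike_shop("tension screw"))

    # The conversation keeps going while the call is in progress.
    for question in ["Back to the manual: what's the brake torque spec?"]:
        print(f"user: {question}")
        await asyncio.sleep(1)  # stand-in for answering the question

    # Surface the call result once it completes.
    print(f"assistant: {await call}")

asyncio.run(main())
```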

Handling multiple speakers without losing focus

While Gemini Live is on the phone with the bike shop, the user asks a follow-up question about the manual. At the same time, someone else asks the user if they want lunch.

Gemini Live pauses but keeps track of everything. Once the user replies to the lunch question, the AI resumes the conversation about the manual without missing a beat.

Context-aware online shopping

At the end of the clip, the user asks Gemini Live to find dog baskets for his bike. The AI surfaces suggestions that would fit his dog, apparently recognizing the pet from Google Photos or past interactions.

Gemini Live can also make purchases, likely through Project Mariner. We don’t see this in action, but when the AI confirms the bike shop has the needed tension screws, it offers to place a pickup order.

These new Gemini Live features won’t roll out immediately. Google is collecting feedback from trusted users first. Once ready, the features will be available on mobile devices and Android XR wearables.

As a longtime ChatGPT user, I can already say I envy these capabilities. Hopefully, OpenAI is working on something similar.

Virtual try-on and agentic shopping come to Google Search

Google is taking its virtual try-on feature to a new level. Instead of seeing what a piece of clothing might look like on a wide range of models, it’s now testing a feature that lets you upload a photo of yourself to see how it might look on you.

The new feature is rolling out in Search Labs in the US today. Once you opt into the experiment, you can check it out by selecting the “try it on” button next to pants, shirts, dresses, and skirts that appear in Google’s search results. Google will then ask for a full-length photo, which the company will use to generate an image of you wearing the piece of clothing you’re shopping for. You can save and share the images.

GIF: Google

Google says the feature uses an AI model that “understands the human body and nuances of clothing — like how different materials fold, stretch, and drape on different bodies.”

Google will also soon allow you to shop in AI Mode — the new, Gemini-powered search experience it began rolling out to more users in March. If you tell AI Mode that you’re looking for a new travel bag, for example, it will automatically show a personalized panel of images and product listings.

AI Mode will surface a personalized list of product results. Image: Google

You can narrow the selection by providing more details about your needs, like noting you need a bag for a May trip to Portland, Oregon. Google says AI Mode will conduct multiple searches simultaneously, allowing it to determine what features are suitable for rainy weather and then surface results that meet those needs, such as having waterproof fabric and more pockets.
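Google has described this as issuing many related queries at once and filtering the merged results against your stated constraints. A toy version of that fan-out-and-filter pattern in Python; the `search` function and product data are fabricated for illustration:

```python
import asyncio

async def search(query: str) -> list[dict]:
    """Stand-in for one product search; returns fake listings."""
    await asyncio.sleep(0.1)
    return [
        {"name": f"City Tote ({query})", "waterproof": False, "pockets": 2},
        {"name": f"Trail Pack ({query})", "waterproof": True, "pockets": 6},
    ]

async def shop(subqueries: list[str]) -> list[dict]:
    # Fan out: run all subqueries concurrently, then merge the results.
    batches = await asyncio.gather(*(search(q) for q in subqueries))
    merged = [item for batch in batches for item in batch]
    # Filter against the inferred needs: a rainy Portland trip calls
    # for waterproof fabric and more pockets.
    return [i for i in merged if i["waterproof"] and i["pockets"] >= 4]

picks = asyncio.run(shop([
    "waterproof travel bag", "carry-on backpack", "packable daypack",
]))
print(picks)
```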

The new “Buy for me” button will let Google complete checkout for you. Image: Google

There’s also a new “agentic” checkout feature rolling out to Google users in the US in the coming months. Right now, if you tap “track price” on a product listing and select a size, color, and the amount you want to spend, Google will automatically notify you when the price drops to your preferred level. The new feature lets you confirm your purchase details and then select “buy for me.” Google will then go to the merchant’s website and “securely complete the checkout on your behalf” using Google Pay.
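The price-tracking half of this is essentially a threshold watch. A bare-bones illustration; `fetch_price` and `notify` stand in for Google’s internal plumbing and are not real APIs:

```python
import random
import time

def fetch_price(product_id: str) -> float:
    """Stand-in for checking the merchant's current price."""
    return round(random.uniform(80, 120), 2)  # fake price feed

def notify(message: str) -> None:
    """Stand-in for a push notification."""
    print(message)

def track_price(product_id: str, target: float, interval_s: float = 1.0) -> None:
    """Poll until the price drops to the user's target, then notify.
    Only then would the 'buy for me' step hand off to checkout."""
    while True:
        price = fetch_price(product_id)
        if price <= target:
            notify(f"{product_id} dropped to ${price:.2f} (target ${target:.2f})")
            return
        time.sleep(interval_s)

track_price("travel-bag-123", target=95.00)
```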
