Google focuses on eyewear, AI

Google just dropped a new AI shopping experience that's going to bypass social. Here's what you need to know:

At Google I/O 2025, the company announced a virtual try-on technology that could change how people shop for clothes online. By uploading a photo of yourself, you can now see how different outfits look on your own body.

The “try-on” tool is rolling out today in the US via Search Labs, allowing you to access 50 billion apparel listings. When you search for clothing on Google, look for the “Try it on” icon next to supported products.

My first thought? This is like Cher's closet in Clueless!

My second thought? If people can see how clothes look on them just by uploading a photo to Google, why would they need to scroll through brand socials or UGC?

We all keep citing that stat: 45% of Gen Z prefers social search over Google. And that’s still true. Social plays a huge role in discovery. The customer journey has changed.

And yet... a lot of people still use search engines. And if Google starts to personalize the experience, pulling directly from retailers, that could essentially sideline the role of social.

**It's important to note:** there is no social integration at this time. If someone wants to try on the latest Gap x DÔEN collection, it's not pulling from Instagram or TikTok. It's pulling directly from retailer websites.

My POV: This is a shopper-first feature and, if done right, could be a great experience.

But brands need to pay attention. If this takes off, it prioritizes keeping Google Merchant Center up to date with new products, accurate prices, and high-fidelity images.

And it could reduce the need for UGC. If shoppers can see how clothes look on them, they may not care how they look on random people.

Would this change your shopping habits? Would you skip TikTok reviews and go straight to search?

Google has begun rolling out AI Mode to every US search user, announcing the expansion during its I/O 2025 conference. Google began previewing AI Mode with testers in its Labs program at the start of March and has been gradually rolling the feature out to more people since, including, in recent weeks, regular Search users. At its keynote today, Google also shared several updates coming to AI Mode, including new shopping tools, the ability to compare ticket prices for you, and custom charts and graphs for queries on finance and sports.

For the uninitiated, AI Mode is a chatbot built directly into Google Search. It lives in a separate tab and was designed by the company to tackle more complicated queries than people have historically used its search engine to answer. For instance, you can use AI Mode to generate a comparison between different fitness trackers. Before today, the chatbot was powered by Gemini 2.0. Now it's running a custom version of Gemini 2.5. What's more, Google plans to bring many of AI Mode's capabilities to other parts of the Search experience.

"AI Mode is where we'll first bring Gemini's frontier capabilities, and it's also a glimpse of what's to come," the company wrote in a blog post published during the event. "As we get feedback, we'll graduate many features and capabilities from AI Mode right into the core search experience in AI Overviews."

Looking to the future, Google plans to bring Deep Search, an offshoot of its Deep Research mode, to AI Mode. Google was among the first companies to debut the tool in December. Since then, most AI companies, including OpenAI, have gone on to offer their take on Deep Research, which you can use to prompt Gemini and other chatbots to take extra time to create a comprehensive report on a subject. With today's announcement, Google is making the tool available in a place where more of its users are likely to encounter it.

Another new feature that's coming to AI Mode builds on the work Google did with Project Mariner, the web-surfing AI agent the company began previewing with "trusted testers" at the end of last year. This addition gives AI Mode the ability to complete tasks for you on the web. For example, you can ask it to find two affordable tickets for the next MLB game in your city. AI Mode will compare "hundreds of potential" tickets for you and return with a few of the best options. From there, you can complete a purchase without having done the comparison work yourself.

"This will start with event tickets, restaurant reservations, and local appointments," says Google. "And we'll be working with companies like Ticketmaster, StubHub, Resy, and Vagaro to create a seamless and helpful experience."

AI Mode will also soon include the ability to generate custom charts and graphics tailored to your specific queries. At the same time, AI Mode will soon become more personalized, with Google introducing an optional feature that lets the tool draw on your past searches. The company will also give people the option to connect their other Google apps to AI Mode, starting with Gmail, for even more granular recommendations.

As mentioned above, Google is adding a suite of shopping features to AI Mode. Engadget has a separate post dedicated to the Shopping features Google announced today, but the short of it is that AI Mode will be able to narrow down products for you and complete purchases on your behalf, with your permission, of course.

All of the new AI Mode features Google previewed today will be available to Labs users first before they roll out more broadly.

Hello from Google I/O! Just got off the stage here at Shoreline Amphitheatre in Mountain View, where we shared how decades of AI research are becoming reality for people around the world. A few of the things we announced:

- AI Mode in Search is now rolling out to everyone in the US, expanding on what AI Overviews can do.

- An improved Gemini 2.5 Flash, plus Deep Think, an enhanced reasoning mode coming to Gemini 2.5 Pro.

- Veo 3 is state of the art, and with a new filmmaking tool, Flow, we’re combining the best of Veo, Imagen, and Gemini.

- A new agent mode in the Gemini app can help you get more done across the web.

- New real-time translation is coming to Google Meet, and we introduced an AI-first video platform for 3D experiences called Google Beam.

- Personalized smart replies are coming to Gmail.

- A glimpse into the future with XR glasses.

We also shared some of the incredible momentum we’re seeing with our models, the rapid adoption of AI across our products, and much more.

🤯 GOOGLE LAUNCHED AI MODE 🤯 It’s going to have a huge impact on the way we all think about search because contextual search is going to get a whole lot more personalized…

Today, the search experience varies from postal code to postal code, but with this type of feature, it’s going to vary from individual to individual. Imagine:

Google knows you don’t like onions, so you never get recipe results and links that include onions.

Google knows your itinerary for your trip to New York, so it gives you “things to do” that are near your hotel and based on other things you’ve enjoyed via Google reviews.

Google knows you’re getting married in the fall, so when you look for a photographer, it prioritizes finding ones who have reviews about weddings.

It gets better, though:

Imagine it’s connected to your calendar and when you type in “places to eat,” it knows where you are and how fast the meal needs to be made…

I don’t know how much of this will be possible in 2025 but I do know that personalized search is going to make the process of tracking what ranks and doesn’t rank in the SERP a lot more difficult.

This is going to mean:

Distribution matters more than ever.

Brand matters more than ever.

Storytelling on channels beyond Google will matter MORE than ever for marketers.

No…

I don’t think SEO is dead.

But yes… I’m confident that the strategies that have worked in the past are getting close to their expiration date.

Some marketers are freaking out.

I get it.

But this is the industry…

Embrace change.

Embrace the fundamentals.

And remember that content marketing is a two-word industry. It’s not just about creating mediocre content with AI tools.

It’s about creating content worth marketing.

Google announced on Tuesday during Google I/O 2025 that Project Astra — the company’s low-latency, multimodal AI experience — will power an array of new experiences in Search, the Gemini AI app, and products from third-party developers.

Most notably, Project Astra is powering a new Search Live feature in Google Search. When using AI Mode, Google’s AI-powered search feature, or Lens, the company’s visual search feature, users can click the “Live” button to ask questions about what they’re seeing through their smartphone’s camera. Project Astra streams live video and audio into an AI model, and responds with answers to users’ questions with little to no latency.

First unveiled at Google I/O 2024 through a viral smart glasses demo, Project Astra was born out of Google DeepMind as a way to showcase nearly real-time, multimodal AI capabilities. Google now says it’s building those Project Astra glasses with partners including Samsung and Warby Parker, but the company doesn’t have a set launch date yet. What the company does have is a variety of Project Astra-powered features for consumers and developers.

Google says Project Astra is powering a new feature in its Live API, a developer-facing endpoint that enables low-latency voice interactions with Gemini. Starting Tuesday, developers can build experiences that support audio and visual input, and native audio output, much like Project Astra. Google says the updated Live API also has enhanced emotion detection, meaning the AI model will respond more appropriately, and includes thinking capabilities from Gemini’s reasoning AI models.

In the Gemini app, Google says Project Astra’s real-time video and screen-sharing capabilities are coming to all users. While Project Astra already powers Gemini Live’s low-latency conversations, this visual input was previously reserved for paid subscribers.

Google seems confident that Project Astra is the future for many of its products, and can even power an entirely new product category: smart glasses. While that may be true, Google still hasn’t set a launch date for the Project Astra smart glasses it demoed last year. The company has offered a few more details on what those smart glasses will look like, but they still seem far from reality.
