Google Photos Adds Personalized Edits, Templates, and a New Ask Button

Google has added another round of artificial intelligence updates to its Photos app, introducing new creative and search functions, several of them built on its Nano Banana image model.

The rollout marks one of the biggest AI upgrades to the service this year, bringing six new tools that aim to simplify photo editing, restyling, and visual search.

The first addition centers on improved personal photo editing. The “Help me edit” option can now adjust portraits more accurately by using reference images stored in a user’s face groups. It can correct closed eyes, remove sunglasses, or fix small facial details in group shots. The feature works through typed or voice prompts and is expanding to iOS users in the United States. Google says these changes are designed to make editing feel more natural by removing the need for manual tools and sliders.

The arrival of Nano Banana, Google’s latest generative image model, extends what users can do with the same editor. Within the “Help me edit” menu, a person can describe an artistic style or mood and see it applied to a photo. The tool can restyle an image into a Renaissance painting, a mosaic, or even a cartoon-like version of the scene.

The model also powers a new section called “Create with AI,” which offers ready-made templates that guide users through common creative edits, including simulated fashion shoots, professional headshots, and festive card designs. The templates are available in the Create tab for Android users in the US and India this week, with personalized versions arriving later for American users. These personalized templates will adapt to the hobbies and themes that often appear in an individual’s gallery.

The Ask Photos tool is also expanding. After a temporary pause earlier this year, it is now rolling out to more than one hundred new countries and regions. The upgrade adds support for seventeen languages, widening access to Google’s natural language photo search. With it, users can type or say what they are looking for and get direct visual results from their own library, such as “photos from the beach trip” or “pictures of the red car.”

A separate “Ask” button is debuting as well. While Ask Photos searches across the full library, the new button works within a single image. Tapping it lets users find related pictures, identify what appears in the shot, or request instant edits without leaving the viewer screen. The button is limited to the United States for now but is available on both Android and iOS.

Most of these upgrades are already rolling out, with broader visibility expected over the next few days. Together they continue Google’s steady integration of AI into Photos, turning everyday image browsing and editing into a more conversational and flexible experience.

