Google is taking Nano Banana, its image editing and generation model, beyond the Gemini app. Starting today, it is being integrated into Google Lens and Search's AI Mode, and it brings new visual styles to NotebookLM. The company has also confirmed that Google Photos will get it in the coming months. What does this mean for Android users? Fast, contextual, natural-language edits, right inside the tools you already use every day.
Until now, Nano Banana lived mainly inside Gemini, where it generated and retouched images from text instructions. Google's move puts it in front of a much wider audience: just open the Google app on Android or iOS, enter Lens and tap the new “Create” mode (the banana icon) to apply changes to your photos or to a fresh capture. It is a relevant step because it enables precise edits that keep a coherent style and allow progressive refinements without leaving the search engine.
Nano Banana in Google Lens and Search's AI Mode
The Lens integration focuses on immediate editing: you can take a photo and describe the transformation (for example, “change the sky to sunset” or “remove the cables”), chain follow-up instructions and share the result. In Search's AI Mode, Nano Banana can also generate images to illustrate queries or refine results with visual proposals better suited to what you asked for.
Another practical detail is the low learning curve: Google will pair the “Create” button with suggested prompts and examples to get you started. The model also keeps the improvements we already saw in Gemini, such as consistency of characters and objects across edits and region-targeted tweaks, which is key to avoid “recreating” the photo from scratch when all you want is a fine adjustment.
NotebookLM adds more visual styles and videos

The update also reaches NotebookLM, the study assistant. With Nano Banana, “Video Overviews” gain more visual styles (watercolor and comic, among others) to turn notes and documents into short, illustrated videos that summarize their contents. For those who use NotebookLM to support their study or work, this translates into clearer, more memorable summaries with a visual finish closer to what you want to convey.
In parallel, Google has announced that Google Photos will integrate Nano Banana “in the coming months.” If it works as it does in Lens, many quick edits could be made without leaving your gallery, relying on natural language plus additional controls to refine the result.
What changes for you on Android
The most tangible change is that you will no longer need to open Gemini to use Nano Banana: Lens and Search will serve as the entry points. Also relevant is the staggered rollout by language and country, so its arrival in Spain could take a few weeks compared with the United States and India. Either way, the direction is clear: put AI editing at the center of the experiences we already have on mobile, without friction and with results that, in general, better match what the user wants to achieve.
Are you convinced by AI editing built into Search and Lens?
Nano Banana leaving the Gemini “sandbox” and reaching Google's most-used apps is a clear bet on convenience. If you use Android daily, you will likely start seeing the banana icon in Lens and reaching for it for quick changes. Do you find this integration useful, or do you prefer to keep editing in dedicated apps like Google Photos or classic editors? Let us know in the comments.
