A popular social media fad using Google Gemini's "Nano Banana" AI image tool to transform everyday snapshots into retro saree portraits has taken an unsettling detour, prompting urgent cautions about the perils of digital oversharing.
The trend, which has captivated women eager to reimagine themselves in classic Indian attire, involves feeding personal photos into the AI tool alongside tailored prompts. What begins as a lighthearted creative outlet quickly veers into privacy territory, as users unwittingly expose intimate details to vast online ecosystems.
This shift came into sharp focus through a recent Instagram post by an unnamed influencer, who recounted her unnerving brush with the technology. In a candid video, she described submitting a photo of herself clad in ethnic wear, instructing the tool to craft a version featuring a saree and sleeveless blouse. The output initially delighted her: a polished, AI-enhanced portrait that she promptly disseminated across platforms.
That enthusiasm evaporated upon closer inspection. Tucked into the generated image was an unprompted feature: a mole on her left hand, absent from the source photo yet unmistakably hers. “How does Gemini know I have a mole in this part of my body? You can see this mole… this is very scary, very creepy… I am still not sure how this happened. I wanted to share this information with all of you. Please be safe… whatever you’re uploading on social media or AI platforms,” she cautioned in the clip.
The video exploded online, racking up more than seven million views and igniting a flurry of responses. Commenters traded similar anecdotes, piecing together the mechanics at play. One observer noted, “Everything is connected, Gemini belongs to Google, Google reads all the photos you have kept in your Gmail-photos-drive and the posts and videos you have on social media will already be available in the database.” Others chimed in with parallels: “Yes I have noticed same things,” confessed one, while another revealed, “This happened to me too. My tattoos which are not visible in my photos were also visible. I don’t know how but it was happening.”
A more analytical voice cut through the speculation: “Well, that is exactly how AI works. AI draws information from your digital footprint, from all the images that you have been uploading online. So, when you ask AI to generate an image, it is also going to use your uploads from the past. Your mole is visible in your other pictures. It is good to be careful. AI is not really the problem, oversharing of our own information is the problem.”
The incident resonated beyond casual users, drawing intervention from authorities. IPS officer VC Sajjanar took to X, formerly Twitter, to sound a stark alert on the "Nano Banana" craze, the nickname of the Gemini image model driving the saree trend. "Be cautious with trending topics on the internet! Falling into the trap of the 'Nano Banana' trending craze… if you share personal information online, such scams are bound to happen. With just one click, the money in your bank accounts can end up in the hands of criminals," he posted.
Sajjanar stressed restraint: “Never share photos or personal details with fake websites or unauthorised apps. You can share your joyful moments on social media trends, but don’t forget that safety should be your top priority.” He underscored the lasting repercussions, adding, “These trends come and make a fuss for a few days before disappearing… Once your data goes to fake websites or unauthorised apps, retrieving it is difficult.” To amplify his message, he looped in key entities: the Prime Minister’s office, the Indian Cybercrime Coordination Centre, the Ministry of Information and Broadcasting, and the Telangana Police.
Echoing these concerns, finance expert Bhanu Pathak released a video dissecting the underbelly of AI image uploads. He warned that surrendering a photo strips away user oversight. “Even if platforms promise security, leaks and breaches are always possible. Private images can be misused for identity theft, impersonation, or worse,” Pathak stated.
As the "Nano Banana" wave persists, it serves as a sobering reminder: behind the allure of AI-driven artistry lurks the shadow of unchecked data trails, urging a recalibration of what we share in pursuit of viral whimsy.