
The chat trap: Why viral AI trends aren’t as harmless as they look

Dr Joanne Orlando tells WHO about the hidden costs of participating in online fun.
ChatGPT

Just a few short weeks ago, social media feeds were overflowing with side-by-side photos comparing users today with their 2016 selves.


Now, the internet has already leapt to its next viral distraction: AI-generated caricatures created from whatever information ChatGPT can gather about a person, paired with a supplied photo.

It all feels lighthearted and fun, but do these artificial intelligence trends come with a hidden cost?

What are you giving away when you share a snap with AI?

“There are a couple of these trends floating around right now and they all seem to have one thing in common,” digital wellbeing researcher Dr Joanne Orlando tells WHO. “They entice us to hand over images and information we wouldn’t normally share, which is gold to an AI company.”

We’ve seen similar crazes before, such as the 2025 Barbie doll transformations and Ghibli-style Japanese animation filters. But the stakes are rising.

AI caricature of Kylie Walters
WHO’s senior writer and royal expert Kylie Walters created her own work caricature for research. (Credit: ChatGPT)

The images and personal details we provide, Orlando explains, can be used for a wide range of purposes. While some are harmless, others are far more concerning.

“They can sell it for marketing purposes,” she says. “It’s also used to train AI and facial recognition systems, which are developing so quickly that your data could be used for something in five years’ time that doesn’t even exist yet – and you wouldn’t necessarily consent to it, but you’ve already handed it over.”

In more extreme cases, she warns, this could even extend to “criminal activity”, such as your likeness being used as “the face of a scam” without your knowledge.


Orlando points to another recent trend that asked people to scan their facial biometrics to receive a rating on how attractive they are.

Miranda Kerr
The trend that saw celebrities such as Miranda Kerr share snaps from 2016 helped show AI how we age. (Credit: Miranda Kerr Instagram)

“Your facial biometrics are as important and unique as your fingerprints,” she says. “In a world where devices and programs are increasingly accessed through biometrics, imagine what could happen if yours fall into the wrong hands.”

It’s no surprise, then, that public trust in AI is slipping.


Roy Morgan research from October 2025 found that 65 per cent of Australians believe AI “creates more problems than it solves”.

Yet, that hasn’t stopped us from joining in.

Trends can spread very quickly, often accompanied by captions and easy links urging others to post their own. And that’s part of the appeal.

“It’s a fun way to play as an adult, which is why they take off,” Orlando says. “But AI can be just as addictive as social media because it keeps prompting you back. If spending too much time on Instagram or Facebook is a problem for you, you’ll likely have the same issue with AI platforms.”


Digital sociologist Dr Jessamy Perriam also understands the draw.

AI shot of Kim and Lewis
Soon, AI will be as good at creating fake shots of everyday people as it is of celebrities such as Kim Kardashian and Lewis Hamilton; this AI-generated image fooled several media organisations.

“With this particular trend, the prompt is to create a caricature based on everything that artificial intelligence knows about a person,” she tells WHO. “People are always curious about how others perceive them. It puts a fun twist on that idea.”

The senior lecturer in the School of Cybernetics at The Australian National University says there are easy steps we can take to stay safe online.


“When you’re handing over any kind of personal information to AI, ask yourself whether you’d be comfortable sharing it in an open online forum. If not, you shouldn’t be giving it,” she says.

“And when it comes to photos, make sure there’s no sensitive information in the background that could reveal your location, workplace or school. If it’s taken at work and you’re a doctor or psychologist, for example, you don’t want a stack of files visible with patient details.”

As AI becomes more advanced, we’ve seen an explosion of seemingly realistic but entirely fake images of celebrities doing the rounds online.

By sharing our own photos, everyday people could become just as vulnerable to having their likeness manipulated.


“Theoretically, after you’ve shared your image and identifying information, anyone could jump onto AI and ask for an image of you in a compromising position,” Perriam explains.

“It’s OK to have fun, but we need to be aware of the risks and have an idea of what we are willing to accept first.”
