
Alert: ‘Hi Mum’ WhatsApp Scam Uses AI Voice Mimicry to Target Parents, Costing Victims Nearly £500,000


For millions, WhatsApp is a lifeline to loved ones around the world. But cybersecurity experts are warning of a fast-growing "Hi Mum" scam, which has cost victims nearly £500,000 since the start of 2024. Here’s what you need to know.

How the Scam Works

Fraudsters pose as family members, typically opening with "Hi Mum" or "Hi Dad" and claiming they have lost their phone and need urgent financial help. They may ask for money for a replacement phone, rent, or an emergency. To make the ploy more convincing, some now use AI voice cloning to mimic a loved one’s voice.

Scam Alert: AI voice cloning makes these scams harder to detect, as fraudsters can replicate a voice from clips posted on social media.

Key Tactics

  • Impersonation: Scammers use social media intel to pose as children, parents, or close friends.
  • Urgency: They pressure victims to act fast, often citing emergencies.
  • Fake Bank Details: Money is requested via unfamiliar accounts, not the impersonated person’s usual one.

Alarming Statistics

Santander recorded 506 cases in 2024, costing victims £490,000. Scammers most often succeed when impersonating sons, followed by daughters and mothers.

How to Spot the Scam

  1. Unexpected Requests: Be wary of messages from unknown numbers claiming to be family.
  2. Urgent Money Requests: Scammers push for immediate transfers to new accounts.
  3. Voice Clues: If a voice note sounds off, verify through a direct call.

Example Messages

A typical scam message opens with a plea for help and quickly moves to an urgent request for money, for example: "Hi Mum, I’ve broken my phone and this is my new number. Can you send me some money today? It’s urgent."

Protect Yourself

  • Verify: Call the person directly using a saved number.
  • Set a Code Word: Create a family emergency code to confirm identities.
  • Report: Forward suspicious texts to 7726 (UK), and use WhatsApp’s in-app report and block options for chats.

If You’re Targeted

  • Don’t Panic: Pause and verify the request.
  • Contact Your Bank: They may block fraudulent transactions.
  • Update Security: Enable two-step verification in WhatsApp’s settings.

Prevention Tips
Always verify unusual requests through a separate communication channel.

Final Thoughts

As AI tools become accessible, scams grow more sophisticated. Stay vigilant, question unexpected messages, and protect your family with proactive measures.

For more details on reporting scams, visit Action Fraud or your local cybercrime authority.



