Deepfake Scam Alert: How to Spot and Avoid AI-Cloned Voice Calls

Deepfake scams typically begin with fraudsters recording a person’s voice from social media or phone calls. (Image source: @UPI_NPCI)

A new wave of financial fraud using deepfake technology is raising alarms. The Unified Payments Interface (UPI) has urged citizens to stay cautious and informed about deepfake scams. The warning comes in response to a rise in scams where fraudsters use artificial intelligence (AI) to clone voices and faces, tricking victims into believing a loved one is in trouble and urgently needs money.

In a recent tweet, UPI highlighted how a single phone call—featuring a familiar voice and a panicked plea for help—can deceive even the most alert individuals. Behind the voice, however, could be a scammer using AI to mimic someone you trust. “One call can trigger panic: a loved one in trouble, a voice that sounds real, and an urgent request to send money online. But it could all be fake. Learn the signs. Don’t let financial frauds play on your emotions,” the tweet read.

These deepfake scams typically begin with fraudsters recording a person’s voice from social media or phone calls. They then use AI tools to replicate that voice and sometimes even simulate the person’s face. Impersonating police or family members, the scammers claim there’s an emergency and request immediate financial help. Victims are often caught off guard by the familiar voice and pressured into transferring money without verifying the situation.

To help the public identify and avoid such scams, UPI has issued clear guidance: if a call feels suspicious, hang up immediately. Verify the caller’s identity by calling them back on a known number. Listen for robotic tones, unnatural pauses, or vague details in the conversation. Asking personal questions that only your real contact could answer is another way to confirm whether the call is genuine.

Anyone who receives such a fraudulent call is urged to report the incident by dialing 1930, the official helpline for reporting cyber fraud.

As AI-driven frauds become more sophisticated, the public is encouraged to stay informed, share safety tips with loved ones, and think twice before acting on emotional requests received via phone or digital platforms.
