Everything You Need to Know about AI Voice Scams

In the latest AI-related scam, scammers are using AI voice cloning tools to sound like your beloved family members, friends, and even customer service representatives and government officials to trick you into sending money or revealing personal information.

Friends, along with technology, the methods of scamming people are also progressing, and artificial intelligence (AI) is the latest fuel on the rising fire of online scams. ISH has released videos on several AI scams. Watch them here: https://www.youtube.com/watch?v=hAA3IM-G5yk & https://www.youtube.com/watch?v=OZmWvgnCr_M. Scammers are now using AI to mimic the voices of loved ones in distress and scam innocent people out of their money. AI tools can clone anyone's voice from just three seconds of audio of that person speaking. Scammers often use this technique to pose as family members, friends, customer service representatives, or government officials to trick victims into providing personal information or sending money. The most common AI voice scams, however, involve impersonating a family member or friend who is in trouble and needs money urgently. The scammer may even use the victim's own name or the name of a family member to make the call more convincing. Scary, right?

In early 2023, international computer security company McAfee conducted a survey called 'The Artificial Imposter' with more than 7,000 people across seven countries, including India. In the report, India tops the list of victims: 83% of Indian respondents targeted by AI-related scams lost money, and about 48% of Indians who fell for AI voice scams lost an average of ₹50,000. The survey also revealed that 69% of Indians doubted they could tell a cloned voice from the real thing, even though 47% of Indian adults had either experienced an AI voice scam themselves or knew someone who had. According to the report, McAfee found more than a dozen AI voice-cloning tools freely available on the internet. Some require only a basic level of experience and expertise, with just three seconds of audio enough to produce an 85% voice match.

Recently, in November 2023, a 59-year-old woman lost ₹1.4 lakh in an AI voice scam in which the fraudster mimicked her nephew's voice. The caller impersonated her nephew, who lives in Canada, by speaking in fluent Punjabi with the help of AI voice tools, and claimed to be in urgent need of money, saying he had been in an accident and was about to be jailed. He asked the woman to transfer money immediately and to keep their conversation a secret from his parents. Unfortunately, by the time she realized she had been scammed, she had already transferred money multiple times to the account specified by the fraudulent caller.

So, how can we prevent AI voice scams? Here are a few tips to help you avoid them:

  • Never give out personal information over the phone unless you are certain of the caller's identity.

  • Be wary of callers who ask for money or personal information urgently.

  • If you are unsure about the legitimacy of a call, hang up and call the person or company back directly on a number you know to be genuine.

  • Be aware of the latest AI voice scam techniques. Scammers are constantly evolving their methods, so it is important to stay up-to-date on the latest scams.

  • Report suspicious activity. If you suspect you are being targeted by an AI voice scam, report it to the cybercrime department and the nearest police station.

Do inform your family and friends about this scam to help prevent them from falling victim to the growing number of AI voice scams in India.
