New AI scam is targeting Indians: All the details


With the rise in popularity and adoption of artificial intelligence (AI) tools, it is now easier than ever to manipulate the images, videos and voices of friends and family members. Earlier this year, it was reported that cyber criminals are using AI-powered voice cloning to target people. According to a new report, India tops the list of victims.
Scammers are using AI to sound like family members in distress, and Indians are falling for such scams. According to a report by McAfee, more than two-thirds (69%) of Indians think they don’t know or cannot tell the difference between an AI-generated voice and a real voice.
Furthermore, about half (47%) of Indian adults have experienced or know someone who has experienced some kind of AI voice scam, which is almost double the global average (25%), said the report, titled ‘The Artificial Imposter’.

“AI technology is fueling a rise in online voice scams, with just three seconds of audio required to clone a person’s voice. The survey was conducted with 7,054 people from seven countries, including India,” the report highlighted.
Indians losing money
According to the McAfee report, 83% of Indian victims said they lost money, with 48% losing over Rs 50,000.
“Artificial Intelligence brings incredible opportunities, but with any technology there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” said Steve Grobman, McAfee CTO.

Why AI voice cloning is dangerous
Every person’s voice is unique, making it the spoken equivalent of a biometric fingerprint. That is why hearing someone speak is a widely accepted way of establishing trust.
But with 86% of Indian adults sharing their voice data online or in recorded notes at least once a week (via social media, voice notes and more), voice cloning has become a powerful tool for cybercriminals.
More findings of the report
McAfee said that two-thirds (66%) of the Indian respondents said they would reply to a voicemail or voice note purporting to be from a friend or loved one in need of money.

“Particularly if they thought the request had come from their parent (46%), partner or spouse (34%), or child (12%). Messages most likely to elicit a response were those claiming that the sender had been robbed (70%), was involved in a car incident (69%), lost their phone or wallet (65%) or needed help while travelling abroad (62%),” the report found.
The rise of deepfakes and disinformation has made people more wary of what they see online. According to the survey, 27% of Indian adults said they are now less trusting of social media than ever before, and 43% are concerned about the rise of misinformation or disinformation.
