Scammers clone girl’s voice using AI in 'kidnapping scam,' demand $1 million as ransom

Apr 17, 2023 - 13:30

The popularity of ChatGPT has piqued the interest of numerous firms in artificial intelligence, and several are now building their own tools. Google, Microsoft, and Meta are all investing heavily in AI, and the technology looks set to be their focus over the next few years. While the notion of artificial intelligence has been around for a long time, its popularity skyrocketed with the introduction of OpenAI tools such as ChatGPT and DALL·E.

According to recent rumours, even Elon Musk is preparing to launch his own AI startup, X.AI, despite being sceptical of artificial intelligence and having called on companies working with AI to pause development.

However, experts have frequently cautioned us about the dark side of AI and how it may be abused by unscrupulous people. OpenAI CEO Sam Altman has previously voiced concern about the potential misuse of AI.

Against this backdrop, a fresh case has horrified the world: fraudsters used artificial intelligence to clone a teenager’s voice and demand a ransom from her mother.

According to a report by WKYT, a CBS-affiliated news station in the United States, a woman from Arizona, Jennifer DeStefano, received a call from an unknown number one day that turned her life upside down.

According to DeStefano, her 15-year-old daughter was on a skiing trip when she received the call. When the mother picked up the phone, she heard her daughter’s voice say ‘Mom,’ followed by sobbing. ‘Listen here, I’ve got your daughter,’ a man said.

“This is how things will play out. You call the cops, you call anybody, I’m going to inject her drugs, I’m going to have my way with her, and I’m going to drop her off in Mexico.”

The mother went on to say that she could hear her daughter’s voice pleading for help in the background. The man then demanded $1 million to release the teenager. When DeStefano replied that she didn’t have that much money, the ‘kidnapper’ agreed to settle for $50,000.

DeStefano said that when she received the call, she was at her daughter’s dance studio with other mothers. One dialled 911 while another phoned DeStefano’s husband. Within minutes, it was confirmed that her teenage daughter was safe on her skiing trip. Still, the woman said the voice on the phone sounded exactly like her daughter’s.

“It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried,” she told the local news outlet. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

In a recent Facebook post, the woman described the experience and shared the news report. She cautioned people to protect themselves from similar scams by creating a ‘family emergency word or question that only you know, so you can authenticate you are not being scammed using AI.’

“However, kids who do have public accounts, including our youngest, should be especially concerned because of contractual sponsorships. This was also the reason his account went silent for months, as it occurred just before his second tournament of the season. We weren’t sure if it was linked to another true kidnapping of a buddy, for whom the kidnappers had recently been sentenced to life in prison.”

“Only a voice tape was supplied to his wife in that scenario, and we were unable to recover him. Furthermore, I was not directed to wire money; rather, I was to personally meet the kidnappers, who were to abduct me in accordance with their instructions. Police were on their way during the call to intercept as we were trying to navigate the situation,” the mother added.

“My greatest fear is that this will be used to physically entice and kidnap others, just as they demanded of me. Please report if this has occurred to you or someone you know!!! The only way to stop this is for the general population to raise awareness!! Also, have a family emergency phrase or question that only you know so that you can confirm you are not being duped by AI! Be careful,” she concluded. 
