AI cloning voices - scary stuff

…I guess the best way to protect ourselves would be to ask specific questions that only our loved one would know the answer to -

https://www.sfchronicle.com/bayarea/article/ai-phone-scam-18561537.php

Need a family safe-word.

2 Likes

Yup - like the old days my D and I had the secret word in case someone other than me was picking her up from school.

Sorry, can’t read it, requires subscription. Summary, please.

What I don’t understand is “how?” AI generally needs a learning dataset, so to be used for a person, the AI would need a good data set of examples from that person.

That could work for, say, somebody famous who has a large number of examples of their voice out there. But for your everyday person? So if it sounds like your son, unless your son has a few hundred examples of his voice available, it’s unlikely to be an AI.

However, there are also people who can do the same thing, and people who are really good at it need to hear a lot less in order to emulate a voice.

AIs are good at predictive responses. So unless you regularly get phone calls from your friends and family asking for your bank account number, it’s unlikely that AI will be able to emulate how your friends or family would ask for any sensitive information.

All that being said, scam artists have enough success without having to fake a voice. Why spend the time and effort it would take to create these fake voices when they are unlikely to provide any better success than a regular, everyday voice?

@HImom sorry, it locked on me too - Amy Trapp was in her office at the Mill Valley school where she works when she got a call from an unknown number.

She picked it up, thinking it might have something to do with the school fire drill from earlier in the day. Instead, a familiar voice — one she knew better than any other — was on the line.

“It was my son’s voice on the phone crying, telling me ‘Mom, mom, I’ve been in a car accident!’ ’’ Trapp said. Instantly, she felt rising panic. Images of her son Will, away at college on California’s Central Coast, flashed through her mind: him lying on the side of the road in a pool of blood, or trapped in an overturned car.

Trapp was convinced that her son was in trouble. When a man came on the line and told her he was a police officer and that Will had injured a pregnant woman in the crash and had been taken to jail, she believed him, convinced by the unmistakable sound of Will’s voice. She also put trust in another man who claimed to be a public defender representing Will and asked her to take more than $15,000 from her bank account to pay her son’s bail.

It wasn’t until Trapp’s husband called the police directly, hours into the episode, that the couple realized it was a scam. The men were apparently using technology powered by artificial intelligence to copy Will’s voice. Will was quietly studying in his living room throughout the ordeal.

Versions of this type of phone scam have been around for years, said Abhishek Karnik, the senior director of threat research at digital security firm McAfee. The big difference now is the use of AI.

“The fact that it’s so easy to create a cloned voice, you can build a very strong emotional connection with the victim,” Karnik said. “If you add a sense of urgency or distress … they lose their sense of practicality,” he said.

Rapid advances in AI mean technology is now available that requires only a few seconds of a voice sample to create a digital facsimile of a person’s voice.

“Twenty years ago, you needed the resources of a Hollywood studio or a nation state to pull that off,” said FBI San Francisco Special Agent in Charge Robert Tripp. Now criminals can “fabricate a voice using AI tools that are available either in the public domain for free, or at a very low cost.”

Some companies building that technology tightly control who has access to it. But others allow users to upload a voice clip which an AI program can study to generate clips of them convincingly speaking any text a user types in, with up to 85% voice-matching accuracy, Karnik said.

That kind of short voice clip could easily be obtained from social media or elsewhere online, Karnik said.

Phone scams — with and without the use of AI — have rolled across California and the U.S. in recent years.

Some try to convince older people their grandchildren are in trouble and in need of money, with the FBI estimating it has received more than 195 victim complaints about those particular scams, with nearly $1.9 million in losses, through September of last year. Other schemes make it appear the call is coming from the police department. Some demand digital currency to settle a legal matter. A few years ago, an attorney testified before Congress that he lost a large sum to a phone scam almost identical to the one that targeted the Trapp family.

Tripp, the FBI agent, said his office did not have numbers on how many reported scams involved AI, but encouraged anyone who may have been a victim of such crimes to report them to the agency’s Internet Crime Complaint Center.

The California Attorney General’s Office said in an emailed statement that it is aware of the increasing use of AI in common scams, including phone scams, and encouraged victims to report them through its online “Consumer Complaint Against a Business/Company” form.

Although it’s impossible to know precisely which technology the scammers used, Trapp is certain about one thing from that phone call in October: “There was zero doubt in my mind. I was talking to my son.”

In the moment, Trapp had no notion she was the target of a sophisticated scheme, her judgment clouded by terror and adrenaline as an unfamiliar voice took over the call.

After speaking to the person she thought was her son for 30 or 45 seconds, a man who said he was a police officer got on the phone. He told Trapp her son had run a stop sign while on his phone and had hit another car. He had suffered a broken nose and a neck injury, but would be OK. But the pregnant woman in the other car had been taken away, bleeding, the man said.

By this time, the school principal, Lisa Lamar, had joined Trapp. She grabbed Trapp’s phone and put it on speaker so they both could listen. The “officer” said a public defender would call.

The man who claimed to be Will’s assigned public defender and calling himself David Bell called a few minutes later. He told Trapp he had spoken to her son, said he was a good kid, and said he had managed to get Will’s bail knocked down from $50,000 to $15,500.

Could she get that money quickly, he asked? She could, Trapp said.

“He said, ‘Don’t tell [the bank] why you’re getting the money because you don’t want to tarnish your son’s reputation,’ ” Trapp said. In her frantic state, she agreed. “I would have done anything he said.”

Trapp jumped in her car and called her husband, Andy Trapp, who works at another local school, and went to pick him up.

Andy Trapp recalled that his wife was utterly convinced that she had spoken to their son.

“It all comes down to that voice being recognized by his own mother, who he speaks to several times a week,” Andy Trapp said. “I never, ever, thought I would ever fall for anything like that,” he added, noting he doesn’t pick up calls from unknown numbers and is careful about suspicious text and emails.

The couple raced home, where Trapp began frantically packing the family camper van to drive down to be with their son. “We were just absolutely reeling,” she said.

Then the pair sped to their bank branch, headed inside, and asked to withdraw $15,500, just as Trapp had been told to do.

Back home when they spoke again to Bell, he told them he would send a courier to pick up the money from their home. That set off alarm bells, and the couple parked their packed camper away from the house, now fearful of the “courier” and starting to realize something was very wrong.

After that call, “Something changed in me,” Andy Trapp said. “That sounded totally wrong.”

Finally, the ordeal got to her, Trapp recalled. She sank to her knees in front of the van in the street.

“Where is my son? Where is my son?” she screamed.

“That’s when I called the police station and the jail,” in San Luis Obispo, Andy Trapp said. They had no record of the incident.

Ultimately, the Trapps did the right thing, said Tony Cipolla, the public information officer with the San Luis Obispo County Sheriff’s Department. “You call the place where they’re supposedly being held to find out, is this true?” he said.

Then, finally, Trapp called Will.

“Yo, what’s up?” her son answered calmly, ensconced in his living room and surrounded by his roommates, doing homework.

The spell was broken, but it was too much for Trapp. She handed the phone to her husband, who briefly explained the situation to their son, and then hung up to comfort his sobbing wife.

Trapp said she called the San Rafael Police Department, who said there was nothing they could do. The department did not respond to calls and emails from the Chronicle for this story.

Tripp, the FBI agent, said the attempt is still a crime, but prosecuting it would potentially involve going after international criminals. He said the agency monitors reports made to the Internet Crime Complaint Center for trends and has taken down some scammers as a result.

Since the incident, the Trapps have told their story to many people they know, hoping it will save others from suffering the same emotional distress, even if in the end no money changed hands.

Will Trapp isn’t sure how someone could have gotten a recording of his voice. He isn’t an avid social media user, and the accounts he does use are private. He said he sings and makes music, which he sometimes posts online.

“It’s hard to imagine how that could be used because it’s not like my speaking voice,” he said. “It’s really scary.”

1 Like

Thanks so much @JustaMom ! How terrifying! People don’t question when they think it’s a panic situation. I’m glad at least that family didn’t lose money, but it’s awful that the scammers are getting increasingly clever! Yikes!

This is what we’ve always done. But make it something really odd like “What did the fox say? Ans: Turtles fly” Something really unrelated to actual facts.
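For what it’s worth, the family safe-word idea is just a pre-shared secret challenge, the same basic trick computer systems use to authenticate each other. Here’s a toy Python sketch of the idea; the question and answer are hypothetical examples, and obviously nobody would literally run a script mid-phone-call:

```python
import hmac

# Hypothetical pre-shared family challenge/response pairs. As suggested
# above, the answers are deliberately unrelated to real facts, so they
# can't be guessed from social media or public records.
SAFE_WORDS = {
    "what did the fox say?": "turtles fly",
}

def verify_caller(question: str, answer: str) -> bool:
    """Return True only if the caller gives the agreed-upon response."""
    expected = SAFE_WORDS.get(question.strip().lower())
    if expected is None:
        return False
    # Constant-time comparison, out of habit more than necessity here.
    return hmac.compare_digest(expected, answer.strip().lower())

print(verify_caller("What did the fox say?", "Turtles fly"))       # True
print(verify_caller("What did the fox say?", "Ring ding ding"))    # False
```

The point of an answer like “turtles fly” is exactly what’s described above: nothing about it can be derived from public information, so a cloned voice alone isn’t enough.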

My mom got a call like this. It sounded like one of her teen grandsons crying about a car accident and then a man got on the phone explaining that her grandson had caused an accident, etc. Luckily she knew he was in class and was totally fine. Also, he was only 14 and didn’t have his license or access to a car. She hung up but was a little shaken because the voice sounded exactly like her grandson.

1 Like

Clickbait.

My wife, a specialist in AI, says that no, you can’t clone a voice with a few seconds of recording.

That makes zero sense, when you consider just how many different sounds a person uses in speech, the vowels and consonants, the different combinations, the inflections, the places where a person puts emphasis, etc. Nobody gives a good sample of their voice in a few seconds over the phone or in passing.

Second, it ignores the modus operandi of the people who scam the middle class. That $15,000 sounds like a lot, but for scammers collecting info on many people, only a fraction of whom fall for the scam, $15,000 is pennies. They failed here, and they fail in most cases, because most people would demand to speak to their loved one, and, if the “police officer” refused, they would demand to know where this was happening and go there themselves.

Moreover, the police officer doesn’t set bail, a judge does. The police don’t collect bail, the court does. How were they asking this woman to pay, with Amazon gift cards? To a random account?

The entire story sounds lurid and overhyped, from describing what the woman was supposedly thinking: “Images of her son Will, away at college on California’s Central Coast, flashed through her mind: him lying on the side of the road in a pool of blood, or trapped in an overturned car.”

AI is scary, and now media outlets are feeding into and feeding off of that fear.

This entire article is a good example. They make an exaggerated claim, feed you a story that supposedly supports it, then feed you some more questionable statistics and unsupported claims.

Voice cloning is a problem, but mostly for people for whom there is a large amount of recordings available.

Voice cloning can also work for phone scams in which the scam includes having a celebrity speaking on the phone.

I don’t know whether an AI can hold a voice conversation the way it can by text, but if it can’t yet, it’s just a matter of time. Then they just need to replace all of the scammers who call and yell that your computer is sending viruses to the net with a computer. That’s not cloning, exactly, but it is AI-generated.

The lawyer making that claim testified in 2020. Given the amount of training data AI needed in 2020 to clone a voice, the claim is unlikely to be true. It’s more likely that he is simply unwilling to admit that he couldn’t tell the person he was talking to wasn’t his son, and has convinced himself that it was “AI.” Even for the amount of money they could wring from a lawyer, cloning a voice would have been prohibitively expensive at that point.

People who fall for scams, especially people who are supposed to be smart, tend to assign superpowers and all sorts of extreme methods to their scammers. It’s easier mentally than admitting that they were taken in by very simple methods.

Scams work because they play to our fears and hopes, and it is less how smart the victim is, and more how sensitive they are to that particular fear or hope.

2 Likes

Honestly, I think some people just fall for a crying voice that sounds “like” their loved one, plus a plausible story, and don’t want to admit they were duped by a voice actor/team and a script. They want it to have BEEN the loved one’s voice, because of course that’s the only reason they fell for the scheme, even if no $$$ was lost.

D is in cinema and agrees there has to be a much larger body of work of the person to be imitated to have AI mimic a voice.

1 Like

That all makes sense - I didn’t go that deep into it.

1 Like

Actually, voices can be cloned with as little as one minute of noise-free audio. Additionally, these scams come in over phone calls, which are pretty low bandwidth, so an exact match isn’t necessary. All you need is something close. Use that “close clone” over a phone call and target older people, and you have a winning combination.
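To illustrate the bandwidth point: narrowband telephone audio only carries roughly 300–3400 Hz, so much of the high-frequency detail that distinguishes one voice from another never survives the call anyway. A quick, purely illustrative Python sketch using a crude brick-wall filter (not how a real phone codec works):

```python
import numpy as np

SR = 44_100            # original sample rate (Hz)
PHONE_CUTOFF = 3_400   # upper edge of the narrowband telephone channel (Hz)

def phone_channel(x, sr=SR, cutoff=PHONE_CUTOFF):
    """Crude brick-wall low-pass simulating telephone bandwidth."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / sr)
    spec[freqs > cutoff] = 0          # discard everything above the phone band
    return np.fft.irfft(spec, n=len(x))

def tone_energy(x, freq, sr=SR):
    """Magnitude of the FFT bin nearest `freq`."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1 / sr)
    return spec[np.argmin(np.abs(freqs - freq))]

t = np.arange(SR) / SR                # one second of "audio"
# Stand-in for a voice: a low component plus a high-frequency component.
voice = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 6000 * t)
narrow = phone_channel(voice)

print(tone_energy(voice, 6000) > 1000)   # True: high band present before
print(tone_energy(narrow, 6000) < 1.0)   # True: gone after the "phone line"
print(tone_energy(narrow, 440) > 1000)   # True: low band survives
```

The 6000 Hz component vanishes entirely while the 440 Hz one passes through untouched, which is why a merely “close” clone can sound convincing down a phone line.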

1 Like

I’m not sure why a call to their son wasn’t response #1. That would have ended it.

2 Likes

My mom had a phone call similar to this about 5 years ago. The caller said “Hi Grandma, this is Nathan.” Mom was immediately on the alert because 1.) No one calls her grandma, but rather something shorter, and 2.) Nathan had never called my mom before and was only about 14 when this call took place. But, the caller sounded young, so she didn’t hang up right away, in case it was him. He then said he had been in an accident and needed money, etc… She hung up.

I agree with @MWolf . If someone young called in a panic, crying, saying “Mom, mom, help me!”, the first instinct is probably not “This is a scam.” I have a daughter and a son. I think it would be hard for me to fall for a male pretending to be my son, as his voice is very distinctive, but I’m not so sure about my daughter.

Luckily, I have read a lot of these stories in the last year or so, and I think I would be suspicious. But I love the idea of a safe word anyway. I’ve thought of that before, so no time like the present.

I’m linking the Scams thread, so people can be aware. http://talk.collegeconfidential.com/t/scams-youve-encountered

Edit: The phone call was actually creepy because my mom has an unlisted phone number and my nephew was only 14, so how did they get her number and his name? That was the scary bit.

That was the tip off when my parents’ friend got this kind of call - they opened with “grandma” which was not what their grandchildren called them.

We just need to be trained that ANY time we get a concerning call from ANYONE, we hang up and call the person or agency directly.

3 Likes

I also don’t answer my phone for numbers not in my contacts. If it’s important, they’ll leave a message.

3 Likes

This topic was automatically closed 180 days after the last reply. If you’d like to reply, please flag the thread for moderator attention.