Amy Trapp was in her office at the Mill Valley school where she works when she got a call from an unknown number.
She picked it up, thinking it might have something to do with the school fire drill from earlier in the day. Instead, a familiar voice – one she knew better than any other – was on the line.
“It was my son’s voice on the phone crying, telling me ‘Mom, mom, I’ve been in a car accident!’” Trapp said. Instantly, she felt rising panic. Images of her son Will, away at college on California’s Central Coast, flashed through her mind: him lying on the side of the road in a pool of blood, or trapped in an overturned car.
Trapp was convinced that her son was in trouble. When a man came on the line and told her he was a police officer and that Will had injured a pregnant woman in the crash and had been taken to jail, she believed him, convinced by the unmistakable sound of Will’s voice.
She also put trust in another man who claimed to be a public defender representing Will and asked her to take more than US$15,000 (RM70,282) from her bank account to pay her son’s bail.
It wasn’t until Trapp’s husband called the police directly, hours into the episode, that the couple realised it was a scam. The men were apparently using technology powered by artificial intelligence to copy Will’s voice. Will was quietly studying in his living room throughout the ordeal.
Versions of this type of phone scam have been around for years, said Abhishek Karnik, senior director of threat research at digital security firm McAfee. The big difference now is the use of AI.
“The fact that it’s so easy to create a cloned voice, you can build a very strong emotional connection with the victim,” Karnik said. “If you add a sense of urgency or distress … they lose their sense of practicality,” he said.
Rapid advances in AI mean technology is now available that requires only a few seconds of a voice sample to create a digital facsimile of a person’s voice.
“Twenty years ago, you needed the resources of a Hollywood studio or a nation state to pull that off,” said FBI San Francisco Special Agent in Charge Robert Tripp. Now criminals can “fabricate a voice using AI tools that are available either in the public domain for free, or at a very low cost”.
Some companies building that technology tightly control who has access to it. But others allow users to upload a voice clip that an AI program can study to generate clips of them convincingly speaking any text a user types in, with up to 85% voice-matching accuracy, Karnik said.
That kind of short voice clip could easily be obtained from social media or elsewhere online, Karnik said.
Phone scams – with and without the use of AI – have rolled across California and the US in recent years.
Some try to convince older people that their grandchildren are in trouble and need money; the FBI says it received more than 195 victim complaints about those particular scams, with nearly US$1.9mil (RM8.90mil) in losses, through September of last year. Other schemes make it appear the call is coming from the police department.
Some demand digital currency to settle a legal matter. A few years ago, an attorney testified before Congress that he lost a large sum to a phone scam almost identical to the one that targeted the Trapp family.
Tripp, the FBI agent, said his office did not have numbers on how many reported scams involved AI, but encouraged anyone who may have been a victim of such crimes to report them to the agency’s Internet Crime Complaint Center.
The California Attorney General’s Office said in an emailed statement that it is aware of the increasing use of AI in common scams, including phone scams, and encouraged victims to report them at oag.ca.gov/report.
Although it’s impossible to know precisely which technology the scammers used, Trapp is certain about one thing from that phone call in October: “There was zero doubt in my mind. I was talking to my son.”
In the moment, Trapp had no notion she was the target of a sophisticated scheme, her judgment clouded by terror and adrenaline as an unfamiliar voice took over the call.
After speaking to the person she thought was her son for 30 or 45 seconds, a man who said he was a police officer got on the phone. He told Trapp her son had run a stop sign while on his phone and had hit another car. He had suffered a broken nose and a neck injury, but would be OK. But the pregnant woman in the other car had been taken away, bleeding, the man said.
By this time, the school principal, Lisa Lamar, had joined Trapp. She grabbed Trapp’s phone and put it on speaker so they both could listen. The “officer” said a public defender would call.
A man calling himself David Bell, who claimed to be Will’s assigned public defender, called a few minutes later. He told Trapp he had spoken to her son, described him as a good kid, and said he had managed to get Will’s bail knocked down from US$50,000 (RM234,275) to US$15,500 (RM72,625).
Could she get that money quickly, he asked. She could, Trapp said.
“He said, ‘Don’t tell (the bank) why you’re getting the money because you don’t want to tarnish your son’s reputation’,” Trapp said. In her frantic state, she agreed. “I would have done anything he said.”
Trapp jumped in her car, called her husband, Andy Trapp, who works at another local school, and went to pick him up.
Andy Trapp recalled that his wife was utterly convinced that she had spoken to their son.
“It all comes down to that voice being recognised by his own mother, who he speaks to several times a week,” Andy Trapp said. “I never, ever, thought I would ever fall for anything like that,” he added, noting he doesn’t pick up calls from unknown numbers and is careful about suspicious texts and emails.
The couple raced home, where Amy Trapp began frantically packing the family camper van to drive down to be with their son. “We were just absolutely reeling,” she said.
Then the pair sped to their bank branch, headed inside, and asked to withdraw US$15,500, just as Trapp had been told to do.
Back home, when they spoke to Bell again, he told them he would send a courier to pick up the money from their home. That set off alarm bells, and the couple parked their packed camper away from the house, now fearful of the “courier” and starting to realise something was very wrong.
After that call, “Something changed in me,” Andy Trapp said. “That sounded totally wrong.”
Finally, the ordeal got to her, Trapp recalled. She sank to her knees in front of the van in the street.
“Where is my son? Where is my son?” she screamed.
“That’s when I called the police station and the jail” in San Luis Obispo, Andy Trapp said. They had no record of the incident.
Ultimately, the Trapps did the right thing, said Tony Cipolla, the public information officer with the San Luis Obispo County Sheriff’s Department. “You call the place where they’re supposedly being held to find out, is this true?” he said.
Then, finally, Trapp called Will.
“Yo, what’s up?” her son answered calmly, ensconced in his living room and surrounded by his roommates, doing homework.
The spell was broken, but it was too much for Trapp. She handed the phone to her husband, who briefly explained the situation to their son, and then hung up to comfort his sobbing wife.
Trapp said she called the San Rafael Police Department, which said there was nothing it could do. The department did not respond to calls and emails from the Chronicle for this story.
Tripp, the FBI agent, said the attempt is still a crime, but prosecuting it would potentially involve going after international criminals. He said the agency monitors reports made to the Internet Crime Complaint Center for trends and has taken down some scammers as a result.
Since the incident, the Trapps have told their story to many people they know, hoping it will save others from suffering the same emotional distress, even if in the end no money changed hands.
Will Trapp isn’t sure how someone could have gotten a recording of his voice. He isn’t an avid social media user, and the accounts he does use are private. He said he sings and makes music, which he sometimes posts online.
“It’s hard to imagine how that could be used because it’s not like my speaking voice,” he said. “It’s really scary.” – San Francisco Chronicle/Tribune News Service