The next time you get a robocall with a voice generated by artificial intelligence, you can sue the caller.
The Federal Communications Commission has ruled that using AI to mimic the voice of a celebrity, politician, or family member in an unwanted call violates the Telephone Consumer Protection Act.
“If there’s no enforcement action, it just won’t stop,” said Zachary Zermay, a consumer protection attorney in Fort Myers who specializes in robocalls and spam. “That’s what consumers want. They want these phone calls to stop. They’re annoying. They’re abusive.”
Under the ruling, people who receive more than one illegal robocall can sue the caller for $1,500 per call.
Zermay admits that’s not a lot of money, but says “the point of the act is to get the calls to stop. If they’re facing the business end of a lawsuit, the calls oftentimes stop.”
The FCC ruling came just two weeks after reports that some New Hampshire residents received phone calls telling them to skip the state’s primary.
The voice on the calls had been generated by AI to sound like President Joe Biden.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters,” said FCC Chairwoman Jessica Rosenworcel.
Unfortunately, the scam calls work.
According to the Federal Trade Commission, Americans lost $2.7 billion to imposter scams in 2023.