AI technology is making it easier for criminals to clone voices.
HOUSTON — You get a frantic call from a loved one who says they need help, they need money, and they need it now.
But you take a step back to think, is it really your family or a scam?
Technology is making it easier for criminals to clone voices, and sometimes it can be a scam using a voice that sounds just like someone you know.
In mere seconds, KHOU 11 Reporter Amanda Henderson was able to get an AI app to scan her face, listen to a few seconds of her voice, and create a video that appears to show her face and use her voice. The video, however, is entirely AI-generated.
You know it’s AI because we told you. But without being told, could you tell whether the video or voice is fully AI?
This is the reality people across the nation are facing, including Houstonian Charles Lafkoff.
“I got, I got fooled,” Lafkoff said.
Lafkoff said earlier this year he believed he was speaking on the phone with his son and his attorney, who allegedly needed money to get out of jail.
“He [appeared to be] very nervous. He [appeared to be] upset because he was worried,” Lafkoff said.
That worry spread to Lafkoff, to the point that he fed $15,000 into a Coinme machine. He believed the money would be used by his son’s alleged attorney as bail.
After being asked for more money, Lafkoff called his son directly.
“I go, ‘You weren’t at the jail, the Harris County Jail?’ He goes, ‘No, what are you talking about?’ That’s when I knew I got taken for a ride,” Lafkoff said.
“You said he even had kind of an inflection in his voice that made it not sound like AI or not sound like a scam,” Henderson said.
“The AI-generated voice not only mimicked my son’s voice, but it mimicked an emotional situation,” Lafkoff said.
More than 500,000 people have reported imposter scams so far in 2025, according to the Federal Trade Commission.
Lafkoff is among the 20% impacted financially. The losses total nearly $1.7 billion.
Jon Clay with cybersecurity company Trend Micro explains how this continues to happen.
“Amanda, the technology has gotten to the point where it is going to be very difficult for a human being to identify if the image is real, if the voice is real,” Clay said. “Everybody probably has their voice somewhere on the internet. They’ve got a video of themselves somewhere on the internet. They definitely have pictures of themselves on the internet, and so they scrape that stuff up, and they can then take these, these AI applications and say, hey, use this voice.”
Clay explains that there are some ways to keep your voice and personal information from ending up in these scams.
“You’re going to have to use technology to defeat the technology that the adversaries or the scammers are using,” Clay said.
That includes using two-factor authentication, having a code word that only family members would know, or even using AI to scan possible scam messages.
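The code-word idea is simple enough to sketch in code. Below is a purely illustrative Python example (the function name and the phrase are hypothetical, not from the report): because a cloned voice can mimic sound and emotion but not a shared secret, a family code word gives you something a scammer can't scrape from the internet.

```python
import hmac

# Hypothetical helper: check whether a caller's spoken code word matches the
# phrase the family agreed on in advance. For a phone call the real defense is
# simply that the secret exists; hmac.compare_digest is just the idiomatic way
# to compare secrets in Python.
def caller_knows_code_word(spoken: str, family_code_word: str) -> bool:
    # Normalize casing and surrounding whitespace before comparing.
    return hmac.compare_digest(
        spoken.strip().lower().encode(),
        family_code_word.strip().lower().encode(),
    )

print(caller_knows_code_word("Bluebonnet", "bluebonnet"))  # True
print(caller_knows_code_word("wrong guess", "bluebonnet"))  # False
```

The point isn't the code itself: it's that verification should rest on information only your family knows, not on how convincing the voice sounds.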
There are now new state laws aimed at addressing AI. One that went into effect in September makes it illegal to harm or defraud someone by using another person’s voice, name or likeness without their permission.
