I recently fell victim to a sophisticated phone scam that used AI to clone the voice of a family member who was “in trouble” and needed immediate financial assistance (bail) following a serious incident that involved other victims. The generated voice was so convincing that I let my guard down to help out.

When I realized I’d been deceived, I felt stupid and ashamed, and didn’t want to talk about it with anyone. I’m still smarting, but I want to warn folks that this can happen to them too.

Resourceful criminals can create audio clips that perfectly simulate the voices of people we know. Don’t assume a voice is real just because you recognize it. Set up private passwords with family members and close friends now. Then, if someone calls asking for money, ask them for the password. If it sounds suspicious, hang up.

It’s better to be rude than to get ripped off. If the caller persists, claiming a court “gag order” forbids sharing specific information with others for verification, that’s a definite red flag. Don’t fall for it.

To prevent our own voices from being cloned and used to bilk others: when a caller from an unfamiliar number asks, “Is this so-and-so?” don’t answer “yes.” A recording of that word can be used to create a voice print, and it can also sound like agreement to some transaction.

Stay vigilant to keep from becoming a victim of fraud. Check out the FBI’s public service announcement on this scam for further suggestions.

David Wade
Portland