A woman from Boise said she already knew of scams like demands to send money to save a loved one’s life in Mexico, but this time, she said, the scam felt entirely new.
Introduction: A plan to trap the woman
BOISE, Idaho – Phone scams are nothing new: most people have already been targeted by scam calls and message links in which fraudsters try to convince victims to hand over their hard-earned money.
Barbara Vaughn of Boise became the latest victim. According to her, the scammers used artificial intelligence software to mimic her grandson’s voice, convincing her to hand over more than $15,000.
According to Vaughn, an individual posing as a prosecuting attorney contacted her and told her that her grandson was in custody following a car accident involving a pregnant woman.
Vaughn said she knew how to spot these kinds of scams and was hesitant at first, but she was all but convinced after hearing what sounded like her grandson’s voice repeating personal information and using phrases he was known for.
The scammers told her that her grandson needed $15,000 for a bond, she said.
Vaughn explained that the fraudsters then instructed her to withdraw the funds from her bank and package them; a supposed bail bondsman later came to her residence, collected the cash, and departed with her money.
“It was normal for me to get scam calls, and I knew how to respond to them, ignore them, and not open their messages,” Vaughn said. “When this happened to me, it felt so real. I never sensed that he was trying to con me. It was like speaking to my grandson; I truly believed it was him, and that is what caught me off guard.”
Game Over: Police Start Investigation
According to the statement, Vaughn contacted the police after noting the license plate number of the car that showed up at her house, and police are now investigating the incident.
The Boise Police Department has acknowledged the situation, recognizing the evolving nature of scam tactics. The community is urged to exercise increased caution when receiving such calls.
What is an AI-driven scam?
AI-driven scams are fraudulent schemes that use smart technologies such as artificial intelligence to deceive people and trick them into giving up their money. They are not limited to financial theft: scammers also harvest people’s personal information to use for their own benefit.
What are the types of AI scams?
Let’s highlight some types of AI scams:
Deepfakes: Scammers use this type of artificial intelligence to create seemingly legitimate audio and video. They draw on large datasets of a person’s images, video, and audio to duplicate that person’s voice and pose as them. The main aim of using this technology is to carry out fraudulent financial transactions.
ChatGPT Phishing: Email phishing means sending fraudulent messages, such as bogus transaction links, or emails purporting to come from a genuine source such as a bank, technology provider, or government department. Scammers have long used it to trick people online, for example by directing you to a website designed to steal your bank details or other personal information.
Phishing with AI-generated Content: AI tools can create highly convincing phishing emails or messages to trick people into revealing personal information or clicking on malicious links.
AI-Enhanced Robocalls: AI is used to make automated phone calls with highly convincing human-like voices, tricking people into believing they are speaking with a real person to obtain money or information illegally.
Fake Social Media Profiles: AI is now heavily used to generate realistic social media profiles that impersonate someone, often to defraud others or engage in identity theft.
Content Generation for Fake News: Scammers use AI-generated articles, blog posts, or social media content to spread false information or disinformation, influencing public opinion or damaging reputations.
AI-Enhanced Tech Support Scams: Scammers use AI to mimic tech support services, claiming to fix computer issues remotely while infecting devices with malware.
AI-Driven Email Compromise: AI tools can analyze email communication patterns to impersonate business executives and request fraudulent transfers of funds.
These scams are becoming more sophisticated thanks to technological advances, and they present a new frontier in how we must keep our money safe and steer clear of scammers. As AI becomes increasingly common, knowing how to safeguard against AI-driven scams will be crucial.
How do scammers use AI voice changers to deceive people?
There are several ways scammers can use AI voice changers to deceive people:
1. Impersonation: Scammers can use AI voice changers to impersonate someone the victim knows and trusts, such as a family member, friend, or authority figure. By mimicking their voice, scammers can request sensitive information or even financial assistance.
2. Fraudulent Calls: They may use AI voice changers to create convincing voices of bank representatives, government officials, or tech support personnel. Victims are then manipulated into sharing personal and financial details or making payments.
3. Robocalls: AI-driven robocalls can use voice changers to mimic real individuals, making it appear as though a live person is on the line. This can increase the effectiveness of scams, such as those involving fake offers, surveys, or debt collection.
4. Social Engineering: Scammers use AI voice changers to carry out social engineering attacks, gaining the victim’s trust by posing as someone they know. This trust is exploited to extract sensitive information or financial assets.
5. Blackmail and Extortion: Scammers create fake audio recordings of victims and threaten to release embarrassing or incriminating content. The AI voice changer can make these threats seem more credible.
To protect against AI voice changer scams, individuals should remain cautious, verify the identity of callers, and avoid sharing personal or financial information over the phone without adequate verification. Additionally, using voice recognition or biometric authentication features when available can help mitigate the risk of falling victim to such scams.