The Anatomy of a Scam
Last year, a colleague called me, distressed, to share that their grandfather had been scammed out of over $10,000. The scammer pretended to be Gramps’ grandson, claiming to have been in a car wreck and urgently needing money. “But don’t tell Mom!” the scammer insisted, “I don’t want her to worry.” Concerned and eager to help, he withdrew the cash and handed it over to a courier. And just like that, the money was gone. Be alert, and know the anatomy of a scam.
Why Are the Elderly Targeted?
Such scams are all too common, preying on the vulnerabilities of individuals, especially the elderly. As people age, they tend to become more trusting, making them prime targets for malicious money schemes. The urgency of the scammer’s story, a common tactic, made it difficult to suspect foul play. This incident is not isolated but part of a growing trend of financial exploitation.
The New York Times recently published a story about a man who lost over $740,000 in retirement savings to similar scams. Like the case above, both scams featured frenetic urgency, a heartbreaking ending, and, most importantly, artificial intelligence.
AI: A New Weapon in Scamming
With the rapid advancement of artificial intelligence (AI), the sophistication and believability of scams are expected to increase dramatically. AI innovation is progressing at an exponential rate, with new tools emerging daily that mimic human behavior, create deepfake videos, and alter voices convincingly.
Deepfake Scams: A Growing Threat
Another article, recently published in The Guardian, discusses the increasing elaboration of deepfake scams. According to the article, the head of the world’s biggest advertising group was the target of an elaborate deepfake scam involving an artificial intelligence voice clone. Mark Read, the CEO of WPP, spoke of the attempted fraud in an email to leadership, warning others at the company to be vigilant and look out for calls claiming to be from other top executives within the company.
Corporations Are Not Immune
Unfortunately, this is not an isolated issue. Large corporations are increasingly being targeted by deepfake scams, and startups are another area of particular interest to scammers.
Bloomberg reports that Ferrari was recently targeted by a deepfake scam in which its CEO, Benedetto Vigna, was impersonated. The fraud was narrowly avoided through one simple statement: I need to identify you.
The Guardian reports that AI voice clones have fooled banks and cybersecurity departments and duped financial firms out of millions of dollars. Experts quoted by CNBC warn it could get worse before it gets better. This raises an important question: if large companies and astute professionals can be fooled by AI scams, how vulnerable does that leave the average individual?
Individuals at Risk: The Deepfake Dilemma
Imagine receiving a frantic call from a loved one, claiming an emergency, and the voice on the other end sounds exactly like them. This scenario is not far-fetched. Tools like Descript and Speechify can make it trivial to replicate someone’s voice, making it increasingly difficult to distinguish between real and fake. AI-generated deepfakes can create realistic videos of people saying things they never actually said. These technologies, while having legitimate uses, also provide new avenues for scammers to exploit.
Protecting Yourself and Your Loved Ones
All of us hope our loved ones or companies are never targeted. However, with the increasing prevalence and sophistication of these scams, protecting ourselves, our companies, and our loved ones is important. Here are some actionable steps:
1. Listen to Your Instincts
If something feels off, it probably is. Trust your gut feelings and take a moment to pause and assess the situation before acting.
2. Stay Informed About Current Scams
Know the anatomy of a scam. Knowledge is power. By being aware of the types of scams that are prevalent, you can better recognize when you are being targeted. Share this information with vulnerable family members and friends.
3. Explain Technology to Others
In recent years, audio deepfake technology has become widely available and low-cost. Some AI models generate realistic imitations of a voice using only a few minutes of audio, often easily obtained via social media.
Explain this technology to your friends and family. Read the news and stay vigilant regarding the current technologies available and how to protect against cyberattacks.
4. Set Up Verification Processes
Establish a verification system with your loved ones. For example, agree that if you ever ask them to do something unusual, they should call you back on a known number before taking any action. This simple step can prevent impulsive decisions based on fraudulent requests.
5. Identify Experts to Call
Have a list of trusted contacts who can provide guidance if you suspect something is amiss. Incident response companies exist to help both businesses and individuals navigate potential scams. Don’t hesitate to reach out for professional assistance if needed.
Conclusion: Staying Vigilant in a Deceptive World
These stories serve as cautionary tales of the dangers we face today, and the increasing risks posed by the advancements in AI. While technology continues to evolve, we must remain vigilant and proactive in protecting ourselves and our loved ones from becoming victims of sophisticated scams. We must stay informed, trust our instincts, and establish strong verification processes to create a safer environment in an increasingly deceptive world. Know the anatomy of a scam.