AI ‘deepfake’ videos make investment scams harder to spot as Americans lose billions
Fraudsters remain as active as ever, and artificial intelligence and other sophisticated tools are raising the risks for potential victims.

According to the FBI’s Internet Crime Complaint Center, Americans were scammed online out of an estimated $12.5 billion last year, up from $10.3 billion in 2022, but the total could be much higher: in one case the FBI worked, it found that only 20% of victims had reported the crimes.

Fraudsters continue to shift their tactics and techniques, and AI is playing an increasingly important role. The FBI estimates that 39% of victims last year were scammed using “deepfakes,” or doctored videos that use AI technology to manipulate or misrepresent what someone actually did or said. The videos are used in investment scams, romance scams and other schemes.

“This national crisis is likely to get worse in the coming years as fraudsters can now use artificial intelligence to create deepfake videos of business leaders, celebrities, politicians and romantic suitors that are difficult to detect,” said Social Catfish, a company that helps people combat online crimes, especially those of a romantic nature, by verifying photos with reverse image searches.

The role of AI in financial fraud

Artificial intelligence and other sophisticated tools have made it harder to avoid digital scams.
Fraudsters can use AI technology to clone voices and trick people into sending money or revealing personal information by impersonating family members, colleagues or friends, said Christian Romero, community manager at Chase. He spoke to about 100 residents during an Oct. 10 anti-fraud meeting in Sun City West. According to the FBI report, Arizona ranks fifth in the nation in both online fraud complaints per capita and financial losses per capita.

Using AI, fraudsters can process larger amounts of data, allowing them to try more password combinations to break into a victim’s account, Romero added. To prevent this, everyone should use strong passwords, change them frequently and use two-factor authentication, he said.

Last year, Americans filed more than 880,000 online fraud complaints with the FBI’s Internet Crime Complaint Center. Social Catfish estimates that 96% of reported lost money was never recovered, in part because most scammers live abroad. Many scammers demand payment in cryptocurrencies, which figure in most investment-related crimes. In some crypto schemes, scammers even offer victims a chance to recover money lost in previous crypto scams.

The FBI reported that the largest share of online-crime losses last year fell into the “investments” category, which covers a wide range of schemes but cites high-return pitches as a common theme. Another prominent category was business email compromise, in which fraudsters infiltrate business email accounts and use the information they obtain, including contact lists, to demand payments for various services.

Also common are ransomware attacks, in which fraudsters infect victims’ computers and demand payment to unlock the machines and the information stored on them. Then there are tech/customer support crimes, in which scammers convince victims that their computers are infected and promise to clean them for a fee.

Ways to detect a deepfake video

Online crimes involving fake or manipulated videos are relatively new. According to Social Catfish, here are some ways you can try to spot fake videos or photos:

Unnatural movements: Look for subtle irregularities in facial movements or expressions, such as the way the eyes and mouth move. Watch especially for discrepancies between the words spoken and the lip movements.

Inconsistent lighting and shadows: Pay attention to lighting, as manipulated videos often struggle to maintain consistency, especially when material from different sources has been combined. Unusual skin tones could be a clue.

Audio irregularities: Pay attention to changes in the tone or quality of a person’s voice. The sounds may seem out of sync and the tone may seem flat or emotionless.

Try to authenticate a video by checking with the source that claims to have created it. On the romance front, potential victims may be conversing with a person who doesn’t exist, at least not as portrayed. Always insist on a face-to-face meeting before making any financial or other commitments. If the other person refuses or claims to live in another country, consider that a red flag.

Other scams are still widespread

Even with the rise of deepfake videos and other technologically sophisticated schemes, criminals are still resorting to many of the same methods they have used in the past.

One scam involves fraudsters claiming knowledge of a warrant for a potential victim’s arrest and demanding, for example, $500 in gift cards to make the case go away, Capt. Brian Stutsman of the Maricopa County Sheriff’s Office said in a speech at the Sun City West anti-fraud event. Another is the “grandparent” scam, in which potential victims are asked for bail money to free a grandchild supposedly being held in a Mexican prison.

“Don’t give anyone anything on the phone,” Stutsman advised. None of the kidnapping scams investigated by the sheriff’s office proved authentic, he added, but some victims were still tricked into handing over money. Seniors, in particular, tend to be trusting and polite when callers reach them on the phone, but “it’s OK to be rude and hang up,” he said.

Lance Hunzeker, a financial crimes officer with the Maricopa County Sheriff’s Office, emphasized that unsolicited incoming calls, text messages and emails are what people need to be most vigilant about.

“Anyone can fall victim to fraud and scams,” Romero said. “I’ve been a victim of fraud myself, and I’m a banker (and) I know how to protect myself.”

Instead of feeling ashamed or embarrassed, victims should reach out to bankers and others with the training and resources to help, he said. It is also important to report crimes to law enforcement and, above all, to stay skeptical.

Reach the author at [email protected].

This article originally appeared on Arizona Republic: Can You Spot an Investment Scam? Deepfake videos can be difficult to detect