AI scams: three artificial intelligence scams to look out for in 2024 - from investment fraud to voice cloning

It’s vital that people protect their social media accounts to prevent scammers

Research has revealed that nearly 75% of people fell victim to fraud in some way in 2023, with online scams surpassing more 'traditional' methods.

The emergence of AI has rendered conventional scam detection techniques - such as scrutinising language for errors - ineffective, with fraudsters now able to leverage tools like ChatGPT to effortlessly craft convincing emails and messages.


AI tools can enable scammers to bypass certain verification processes, and can even allow them to replicate voices to get around voice recognition systems.

Scams.info's research revealed that one in seven adults have experienced financial losses due to fraud, so its anti-fraud experts have picked out some of the most prevalent AI scams of 2024 to help consumers avoid falling victim to them.

Investment Fraud

Scammers have been known to target novice investors on platforms like TikTok, disseminating financial disinformation through pseudo-financial experts who encourage dubious investment opportunities.

Exploiting the buzz surrounding AI, scammers claim that they can use new AI tools to easily generate returns, deceiving would-be investors into parting with their money.


Nicholas Crouch at Scams.info says: “Watch out for investments that promise high returns with little risk and be sure to do comprehensive research before handing over money.

"Budding investors should also be aware of opportunities that ask you to recruit new investors; these often operate as Ponzi or pyramid schemes that while benefiting those at the top of the pyramid, very rarely benefit others involved.

"And finally, be conscious of the financial information you learn online, particularly on social media, there is an increasing amount of financial disinformation around investing that lures investors into sophisticated scams.”

A live demonstration uses artificial intelligence and facial recognition in a dense crowd at CES 2019 (Photo: DAVID MCNEW/AFP via Getty Images)

Impersonating Loved Ones to Extort

One particularly malicious scam involves criminals mimicking the voices of loved ones to fabricate emergencies and extort money. Scammers can extract audio samples from social media posts, and then use AI to replicate the voices of family members or friends.


Crouch says: “It’s vital that people protect their social media accounts to prevent scammers having access to recordings of your voice and details of your wider family.

"For people with public social media accounts for content creation purposes, try creating a family ‘password’ or ‘codeword’ that can be used to verify identity verbally. Even those with private accounts may choose to do so as a precautionary measure.

"This password should be kept top secret and limited to family and close members of your support network and shouldn’t be anything that is predictable based on your social media accounts, or relating to family names or pets.

"If you struggle remembering the passwords make sure to keep a physical note of it rather than a digital one.” 

Bypassing Security with Voice


Similar to impersonating loved ones with the intent to extort, AI voice cloning can also be used to circumvent the use of voice identification as a security measure.

Security features like this are perhaps most prevalent in banking, particularly in Europe and the US, where banks implement multi-phase verification, combining voice identification with confirmation of recent transactions.

Crouch says: “Many banks are using multi-phase verification so that in addition to your voice, you’ll likely be asked to confirm the sum of a recent transaction, or where an amount was spent, so it’s unlikely that fraudsters would be able to bypass this step.

"However, if you’re concerned, you can reach out to your bank for advice, as well as following the precautions mentioned for protecting your voice."
