Intelligence Report | November 2026
THE EMERGING THREAT POSED BY AI
By: Laura Alix
Human manipulation is at the heart of every fraud scheme, but the rapid adoption of artificial intelligence enables bad actors to take it to a scale never before seen. Is the banking industry prepared to deal with this fast-emerging threat?
AI turns fraud into a numbers game; the perpetrator doesn't need to succeed in every single attempt, but if they can con a consumer or business out of money in one or two instances out of 100, that can still be a significant payday.

"Because of AI, [criminals] can replicate fraud schemes much quicker than the old school way of doing it, where a human had to do each one," says Sarah Beth Felix, founder and president of Palmera Consulting.

Scams — when a victim is manipulated into giving up their account information or sending money to a bad actor — are a type of fraud. Romance scams, investment scams and business email compromise are not even particularly novel forms of fraud on their own. But by using AI, fraudsters can more quickly amass data on their targets and reach more potential victims. And using generative AI, they can create realistic deepfake photos, videos or audio to trick people into forking over money. In one example of this, a finance worker in the Hong Kong office of a multinational firm was conned into sending $25 million to scammers who used a deepfake video to pose as the company's chief financial officer, Hong Kong authorities said in 2024.

Experts believe it's a matter of time before fraud involving AI becomes a major threat to the banking industry, though currently it makes up a relatively small proportion of incidents. Just 15% of bank executives and directors who participated in Bank Director's 2025 Risk Survey indicated that their bank or its customers had been directly impacted by fraud involving AI or deepfake media over the prior 18 months. The Deloitte Center for Financial Services estimated in 2024 that the use of generative AI could propel U.S. fraud losses to $40 billion as soon as 2027, compared with around $12 billion in 2023.

"AI use is growing rapidly," observes Steve Sanders, chief risk officer and chief information security officer with CSI. "Right now, the criminals are making better use of it than most security teams."

Scams using deepfake media can be especially pernicious because they're often initiated outside the walls of a bank. Therefore, they don't come to a banker's attention until a customer has either already lost money or is determined to send money to a scammer who's convinced them they are a grandchild in need of bail money or a long-distance lover. That's why customer-facing bank employees need to be empowered to have difficult conversations with potential victims, Felix says. That may be simple and low tech, but it's far from easy.

"No one wants to have difficult conversations because they're so worried about that customer being upset, they can't do what they want so they take all their deposits and leave," she says. "Taking a hard line and saying, 'We will not process this for you' is a tough prospect in a customer-facing role."

Plenty of tools on the market can help banks slow the
THE FRAUD MENACE: PROTECTING YOUR BANK