
The Evolving Landscape of Financial Crime

Lucas Chapin

Head of Data

AI is creating new financial crime typologies – and new ways of stopping them.

Introduction

The modern world of financial crime is personalized, lucrative, and – frankly – terrifying. It’s estimated that over $3 trillion in illicit funds moves through the global financial system each year, meaning that crime proceeds outstrip the yearly GDP of countries like France or Canada. More discouraging still, it’s estimated that just 1% of illicit financial activity is intercepted by law enforcement agencies.

And in case things didn’t seem dire enough, let’s not forget that the world of financial crime is also perpetually changing. Remember the quaint old days of typo-ridden Nigerian prince scams? Those are now so far in the rear-view mirror they have more in common with Al Capone’s actual money-laundering laundromats than with today’s financial crime playbook. Today’s financial crime looks vastly different: consider the Hong Kong financier who lost $25 million by following instructions from his “CFO” over video conference, or the harrowing stories of families who have received phone calls from supposed kidnappers demanding ransom while their children cry out in the background.

Both are examples of AI deepfake technology. Taking a peek behind the curtain, we see that the latest technological advancements fueling these crimes (and many others) are producing criminal tactics and strategies far more sophisticated than what was possible just a few years ago. And while it’s tempting to get caught up in the details of each new scam or fraud typology, it’s important to note that the current wave of criminal methodologies is the result of several macro trends coming together. These include:

  • Step changes in AI technology
  • Innovations in cryptocurrency and money movement
  • Data breaches and dark web marketplaces
  • The shift toward bringing all services online

Let’s break down how each of these trends plays a role in today’s financial crime landscape.

AI Technology

AI technology has improved dramatically in recent years, its increased sophistication due largely to breakthroughs in transformer architectures, generative adversarial networks (GANs), and large language models (LLMs). The use of GANs and LLMs, in particular, has led to an explosion of AI-generated text, image, and video material. This AI-created content is so convincing that recent studies indicate humans can no longer do better than random guessing when attempting to tell the difference between portrait photography and AI-generated images. (Want to see how you fare? Try the test yourself.)

It doesn’t take a great leap of imagination to see how criminals can make use of this technology for a wide range of financial crimes – and just how easy it has become for them to do so. For example, replicating a person’s voice with AI now requires only a roughly five-second sample clip, something that’s easily found thanks to the proliferation of publicly available content on social media. The software required to take this sample and turn it into a convincing text-to-speech tool can cost as little as $10. The companies providing these tools (for legitimate purposes!) obviously prohibit fraudulent use in their terms of service, but policing terms-of-use agreements isn’t a practical way of preventing criminals from engaging in illegal activity. Determined criminals are very good at finding ways to circumvent user agreements and skirt past safety controls, and playbooks for jailbreaking these technologies are traded on the black market.

But text-to-speech isn’t the only way these new AI tools are being weaponized for financial crime. Deepfake video and voice technology may be used together to emulate a person’s appearance and voice in real time, allowing a criminal to have a “live” conversation with a victim. This approach is especially common in romance scams, but is also combined with spear phishing and social engineering in attacks on business organizations. Savvy criminals have already learned how to feed business information such as operational and personnel data into an LLM, giving the model everything it needs to generate highly convincing social engineering messages. The easy-to-spot “Hi [name], are you available? - it’s the CEO” text messages will soon be replaced with texts (and entire conversations!) that are much harder to dismiss as fake.

Cryptocurrency

Like all methods of value transfer, cryptocurrency is used by criminals for illicit purposes. Bitcoin is frequently used today as the payment medium for ransomware, with the proceeds quickly laundered through a coin mixer in an attempt to obfuscate their origins. For many criminals, this style of laundering is arguably more efficient and effective than complex traditional methods such as structuring and layering illicit funds through elaborately linked networks of shell companies.

Criminals also often seek to take advantage of the irreversible nature of transactions made on the blockchain. Crypto exchanges have a tiny window – just a few minutes – in which they are able to “freeze” a transaction in progress before it is finalized. That’s an extremely short window compared to traditional payment rails such as ACH or credit cards. This means that unless real-time fraud monitoring systems flag a transaction as it occurs, it is likely to be completed, and any transferred funds will be extremely difficult to recover.
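To make that timing constraint concrete, here’s a minimal sketch of the kind of pre-settlement check a real-time monitoring system might run before a withdrawal is finalized. The blocklist, dollar threshold, and review window below are illustrative assumptions, not a description of any particular exchange’s controls.

```python
# Minimal sketch of a pre-settlement screening check (illustrative assumptions only).
import time

KNOWN_BAD_ADDRESSES = {"bc1q-example-mixer", "bc1q-example-scam"}  # hypothetical blocklist
FREEZE_WINDOW_SECONDS = 120  # assumed review budget before a withdrawal is final

def screen_withdrawal(destination: str, amount_usd: float, started_at: float) -> str:
    """Return 'block', 'hold', or 'release' for a pending withdrawal."""
    elapsed = time.time() - started_at
    if elapsed > FREEZE_WINDOW_SECONDS:
        # Past the window: the transfer can no longer be stopped.
        return "release"
    if destination in KNOWN_BAD_ADDRESSES:
        return "block"
    if amount_usd > 10_000:
        # Large transfers get a manual look while the window is still open.
        return "hold"
    return "release"

print(screen_withdrawal("bc1q-example-scam", 5_000, time.time()))   # block
print(screen_withdrawal("bc1q-unknown-addr", 25_000, time.time()))  # hold
```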

Finally, crypto’s novelty (and the corresponding lack of education surrounding it) provides a criminal opportunity of its own. Crypto is the de facto currency of choice for pig butchering scams, so named because scammers “fatten up” their victims by paying out fake “returns” on small investments in order to trick them into investing large sums of money, at which point the scammer suddenly disappears with the funds. Talented fraudsters now use the entire crypto infrastructure to their advantage in these scams, creating spoofed crypto exchanges that look legitimate but are in fact completely fraudulent. Elements of social engineering are often included as well, to the extent that a conspirator playing the role of a “trusted financial advisor” may be hired to help lure the unsuspecting victim into investing larger and larger sums. The scam often works to build trust and grow the total investment over the course of weeks or months before the fraud is eventually completed.

Data Breaches

We may have grown accustomed to data breaches making waves in the news, but the true scope of their effect is staggering. Breaches like the recent AT&T leak of approximately 73 million Americans’ private information have become all too common. Frankly, it’s now statistically more likely than not that your personal information (everything from your SSN to your bank account information to – yes – even your mother’s maiden name) has been involved in a data breach.

It’s also important to remember that even if only certain pieces of data are leaked in a breach, criminals can compile a more complete identity profile by scraping data from easily accessed public sources like obituaries and social media. Where information is missing, criminals may use brute-force techniques such as BIN attacks to fill in the gaps. The result? Data middlemen are able to assemble comprehensive profiles of individuals and sell this stolen information on dark web marketplaces. The practice is so widespread, with so much stolen data for sale, that an individual profile often sells for astonishingly little.

Everything Online

While it may seem obvious, arguably the most impactful trend on this list is the gradual transition of our global economy to digital platforms. This is something that’s been happening for decades now, and while it’s brought unprecedented levels of efficiency and convenience to the world economy, it’s also produced a huge number of new financial crime attack vectors. 

Who is it affecting? More demographics than you might think. For example, if your prototypical image of a person falling for an online scam is an elderly person, you may want to think again. In 2023 the group losing more money to scams than any other wasn’t the elderly, but the 18-24 age group. Why? Because more and more scams are designed specifically for the digitally native generation. Links to pernicious malware are now bypassing email and going straight to Instagram and TikTok. Even professional social media sites like LinkedIn aren’t out of the fraudster’s reach, with employment scams on the rise. An online spin on the classic roper/mark con, today’s version sees a fake employer reach out on LinkedIn, conduct a full interview process with a job-seeking candidate, and then conveniently “disappear” after collecting the candidate’s bank account information under the guise of setting up direct deposit.

These scams take advantage of the fact that we have normalized conducting almost all social interactions in an online format, something that the pandemic only accelerated. Employee meetings, job interviews, doctor’s visits, financial advisor meetings – these are all things we do online without a second thought, but which would have been unthinkable even a decade ago. The digital economy gives AI-powered social engineering scams the perfect playground in which to refine and exercise their destructive power. And until we develop more powerful tools for systematically determining the truthfulness/authenticity of a given interaction, it’s safe to say that the frauds and scams will continue. 

What Can Be Done?

Taken together, these trends create a harrowing picture of modern financial crime. But all is not lost! There is plenty of good work and technological innovation happening to help protect and defend the financial system. 

There are many companies that specialize in fighting financial crime. They work directly with financial institutions, which are themselves embracing modern technology and employing new tools (like AI) to support their anti-fincrime efforts. At the same time, tech teams at financial institutions are beginning to build with zero-trust architectures, improving defenses and dramatically strengthening security.

Here are just a few of the ways companies are fighting back against financial crime: 

Behavioral Analytics

Advanced algorithms and machine learning are able to analyze patterns of behavior and detect anomalies indicative of financial crime, such as money laundering or fraudulent transactions. Automated detection systems that use machine learning algorithms have proven more successful than traditional rules-based methods – especially in areas like transaction monitoring.
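As a simple illustration of the idea (a toy sketch, not any vendor’s production system), the example below uses scikit-learn’s IsolationForest to score transactions whose combination of amount, hour of day, and recent transaction velocity looks unlike the historical baseline. The features, distributions, and contamination rate are assumptions chosen for the example.

```python
# Toy sketch: unsupervised anomaly detection over transaction features.
# Feature choices and distributions are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=42)

# Simulated "normal" history: amount (USD), hour of day, transactions in past 24h.
normal_history = np.column_stack([
    rng.lognormal(mean=4.0, sigma=0.5, size=1_000),  # typical amounts (~$55 median)
    rng.integers(8, 22, size=1_000),                 # daytime activity
    rng.poisson(2, size=1_000),                      # low velocity
])

# Two transactions to score: one large, off-hours, high-velocity; one just under $10k.
candidates = np.array([
    [25_000.0, 3, 40],
    [9_900.0, 2, 15],
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal_history)

# predict() returns -1 for points the model treats as anomalies, 1 for inliers.
for row, label in zip(candidates, model.predict(candidates)):
    print(row, "-> flag for review" if label == -1 else "-> ok")
```

In practice, teams tend to layer models like this on top of (rather than instead of) rules, and route flagged activity to human investigators.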

Identity Verification and Customer Onboarding

Companies building “know your evidence” AI-based approaches to validating information can detect deepfakes and verify documentation with far more sophistication than humans can. AI-based document summarization tools, meanwhile, help speed up customer onboarding and due diligence, allowing human analysts to get at essential information more efficiently.

Blockchain Forensics

For every story of a crypto-based theft or money laundering scheme, there is increasingly a story of law enforcement leveraging the transparency and immutability of the blockchain ledger to take down criminals and bring them to justice. Companies like Chainalysis and Elliptic are providing anti-financial crime specialists with a powerful new set of tools for tracking down criminals operating within the crypto world.
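To illustrate the underlying idea (this is not Chainalysis’s or Elliptic’s actual tooling), the sketch below models a slice of a public ledger as a directed graph and traces a path from an address tied to a theft to a deposit address at an exchange, where investigators can pursue legal process. The addresses and amounts are invented for the example.

```python
# Toy sketch: tracing funds across a public ledger modeled as a directed graph.
# Addresses and amounts are invented; real forensics tools also handle address
# clustering, mixers, and cross-chain hops at far greater scale.
import networkx as nx

ledger = nx.DiGraph()
transfers = [
    ("theft_wallet", "hop_1", 50.0),
    ("hop_1", "hop_2", 49.5),
    ("hop_1", "hop_3", 0.5),
    ("hop_2", "exchange_deposit", 49.0),
]
for sender, receiver, amount_btc in transfers:
    ledger.add_edge(sender, receiver, amount=amount_btc)

source, target = "theft_wallet", "exchange_deposit"
if nx.has_path(ledger, source, target):
    path = nx.shortest_path(ledger, source, target)
    print(" -> ".join(path))  # theft_wallet -> hop_1 -> hop_2 -> exchange_deposit
```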

Process Automation and AI-Based Efficiency Gains

When conducting investigations, automated workflows and AI-powered analysis tools can synthesize information across a variety of data sources, allowing investigators to quickly and accurately paint a complete picture of potential criminal activity. By reducing the amount of repetitive or manual effort required to collect data and compile filings like a suspicious activity report (SAR) or suspicious transaction report (STR), compliance teams are able to spend more time on the high-level work of investigating.
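As a toy illustration of what that kind of automation might look like (hypothetical source names and fields, not Hummingbird’s product), the sketch below gathers case facts from a few assumed data sources and assembles them into a draft narrative for an investigator to review and edit.

```python
# Toy sketch: assembling case facts into a draft narrative for human review.
# The data sources, field names, and example values are assumptions.
from dataclasses import dataclass

@dataclass
class CaseFacts:
    subject: str
    alerts: list[str]
    total_amount_usd: float
    counterparties: list[str]

def gather_case_facts(case_id: str) -> CaseFacts:
    # In practice these would be pulled from KYC systems, transaction monitoring,
    # and open-source research; here they are hard-coded for illustration.
    return CaseFacts(
        subject="Acme Trading LLC",
        alerts=["structuring pattern", "high-risk jurisdiction wire"],
        total_amount_usd=184_300.0,
        counterparties=["shell company A", "crypto exchange B"],
    )

def draft_narrative(facts: CaseFacts) -> str:
    return (
        f"Subject {facts.subject} triggered {len(facts.alerts)} alerts "
        f"({', '.join(facts.alerts)}) involving approximately "
        f"${facts.total_amount_usd:,.0f} across counterparties "
        f"{', '.join(facts.counterparties)}. An investigator should verify "
        "each fact before filing."
    )

print(draft_narrative(gather_case_facts("CASE-001")))
```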

Wrap-up

At Hummingbird, we spend a lot of time talking to financial institutions about compliance work. Naturally, there’s often a reluctance to adopt new technologies, since they require a shift in thinking about how to detect and respond to threats.

It’s true that the complexity of AI makes it harder to understand than simple rules-based algorithms. But if the simpler, more straightforward methods of detecting and addressing financial crime are no longer sufficient, then adopting new technology becomes as essential to effective risk mitigation as anything else.

It’s incumbent on us to embrace technological progress, and to build modern AML programs capable of matching the sophistication and effort shown by fraudsters, money launderers, and other financial criminals. On a certain level, it will always be a cat-and-mouse game, but just as financial criminals are supercharging their tactics with AI and modern technology, we can supercharge our own capabilities and fight back by embracing the same tools for good.
