ABA Banking Journal Features Article on Generative AI: Threat and Solution for Financial Crime?


ABA Banking Journal, an online publication providing insights and analysis for banks of all sizes, has featured Alex Capella and Christopher Reimann's article, "Generative Artificial Intelligence: Threat and Solution for Financial Crime?"


Generative artificial intelligence: Threat and solution for financial crime?

AI-generated messages and images can create realistic impersonations, which enable criminals to launch highly effective frauds at scale.

The threat landscape for banks has changed dramatically since the start of 2023. In January, as part of a research project, I spoke to dozens of financial crime compliance officers at leading U.S. banks, and they told me that real-time digital payments, cybercrime and fraud would be the top financial crime threats for 2023/24. No one mentioned generative AI or ChatGPT.

But it is now a top threat. ChatGPT use has grown significantly since November 2022. Given the extraordinary capabilities of ChatGPT and other generative AI platforms, we can assume that criminals are already using it to:

  • Create realistic fake profiles, documents, and transactions that can get past even the best-trained compliance professionals.
  • Develop bots and malware to commit cybercrime.
  • Perpetrate scams to obtain people’s bank account information.

And likely, that’s just the tip of the iceberg, because one of ChatGPT’s most valuable capabilities for fraudsters and cybercriminals is the ability to make the fake look real.

For example, AI-generated messages can create highly realistic impersonations, which enable criminals to launch highly effective frauds at scale. Criminals have already been observed using ChatGPT to create legitimate-looking social media personas that gain users' confidence in order to steal data. FTC chair Lina Khan has warned that ChatGPT could "turbocharge" fraud and scams, making it more difficult for compliance teams to distinguish criminal from legitimate transactions.

The strong link between fraud and financial crime

For many years, fraud and financial crime were treated separately. Fraud was associated with payments, while financial crime was associated with money laundering. But in recent years, banks have integrated these functions for more holistic and effective investigations. As one financial crime compliance officer said to me: “Fraud is the criminal act, and money laundering is the moving of money from that act.”