Compliance professionals working to prevent financial crimes are losing faith that AI will solve more problems than it causes, as recent survey data shows a significant drop between 2023 and 2024 in those saying the technology has positively impacted their programs.
This is one of the findings of a survey issued by financial and risk management solutions provider Kroll in its most recent Financial Crime Report. The report found that, as adoption of AI and machine learning advances, only 20% of respondents now exploring these tools report a “very positive impact” on their financial crime compliance frameworks, down from 37% in the 2023 survey. Dan Rice, managing director of cyber risk at Kroll, said professionals have found that the current set of solutions simply is not up to the task required of them.
“A lot of promises were made in 2023, and have not come to pass. The short answer is that AI was never going to solve the problems it was sold to solve. Many financial institutions and large companies have data problems, and if the data isn’t great, the AI tends not to work well. There were many leaps in logic that suggested AI would fix the data problems and, consequently, many of our other problems. However, that’s not the case and won’t be the case. There’s still a lot of hard work below the surface needed to get this right. Many companies rushed into implementation of AI without proper planning, and now the focus is on developing the right strategy, governance and documentation to ensure compliance,” he said in an email.
At the same time, 71% of respondents expect financial crime risks to increase this year, yet only 23% believe their organization’s compliance program is “very effective.” This may stem at least partly from a lack of technology investment, as only 30% say their organization’s financial crime compliance program is sufficiently resourced in this respect, or from weak governance, as only 29% strongly agree that their organization has a robust governance infrastructure for overseeing financial crime.
AI plays a large role in this perception of risk: 61% cited criminals’ increased use of AI as a leading catalyst for risk exposure in the coming year, outdone only by general cybersecurity risks at 68% (which are themselves increasingly driven by AI). Overall, there seems to be a divide over whether AI ultimately is more boon or burden: while 57% believe AI developments will benefit their financial crime compliance programs, 49% agree AI poses a significant risk to compliance.
The drop in those who say AI has made a very positive impact stands in contrast to other surveys that show AI enthusiasm growing generally. For instance, a recent survey from practice management solutions provider Karbon showed that the proportion of those excited about AI went from 41% to 63% among firm owners and from 26% to 40% among individual contributors and staff in technology, operations and administrative positions. Meanwhile, a report from Wolters Kluwer shows AI implementations rising 34% in just one year. And late last year, an EY poll found AI trust doing a 180: in 2023, 85% said generative AI will not drive increased effectiveness and efficiency over the next three years; just one year later, 87% said it will.
This difference could come down to who was polled. Kroll surveyed more than 600 respondents worldwide, including CEOs, chief compliance officers, general counsel, chief risk officers and other financial crime compliance professionals. Half work in the financial services industry; the remainder come from other regulated industries, including accountancy, insurance, real estate and legal services. Respondents came from the U.S. and UK, Western Europe (France, Germany, Ireland, Italy, Spain and Switzerland), Scandinavia (Norway and Sweden), Asia Pacific (Australia, India and Japan), Hong Kong, Singapore and the Middle East/Africa (United Arab Emirates and South Africa), as well as offshore financial centers: the British Virgin Islands, Cayman Islands and Jersey.