AI and the SEC Whistleblower Program

Tax farming is the practice of licensing tax collection to private contractors. Used heavily in ancient Rome, it’s largely fallen out of practice because of the obvious conflict of interest between the state and the contractor. Because tax farmers are primarily interested in short-term revenue, they have no problem abusing taxpayers and making things worse for them in the long term. Today, the U.S. Securities and Exchange Commission (SEC) is engaged in a modern-day version of tax farming. And the potential for abuse will grow when the farmers start using artificial intelligence.

In 2009, after Bernie Madoff’s $65 billion Ponzi scheme was exposed, Congress authorized the SEC to award bounties from civil penalties recovered from securities law violators. It worked in a big way. In 2012, when the program started, the agency received more than 3,000 tips. By 2020, that number had more than doubled, and it more than doubled again by 2023. The SEC now receives more than 50 tips per day, and the program has paid out a staggering $2 billion in bounty awards. According to the agency’s 2023 financial report, the SEC paid out nearly $600 million to whistleblowers last year.

The appeal of the whistleblower program is that it alerts the SEC to violations it may not otherwise uncover, without any additional staff. And since payouts are a percentage of fines collected, it costs the government little to implement.

Unfortunately, the program has resulted in a new industry of private de facto regulatory enforcers. Legal scholar Alexander Platt has shown how the SEC’s whistleblower program has effectively privatized a huge portion of financial regulatory enforcement. There is a role for publicly sourced information in securities regulatory enforcement, just as there has been in litigation for antitrust and other areas of the law. But the SEC program, and a similar one at the U.S. Commodity Futures Trading Commission, have created a market distortion replete with perverse incentives. As with the tax farmers of history, the interests of the whistleblowers don’t match those of the government.

First, while the blockbuster awards paid out to whistleblowers draw attention to the SEC’s successes, they obscure the fact that the agency’s staffing level has slightly declined during a period of tremendous market growth. In one case, the SEC’s largest ever, it paid $279 million to an individual whistleblower. That single award was nearly one-third of the funding of the SEC’s entire enforcement division last year. Congress gets to pat itself on the back for spinning up a program that pays for itself (by law, the SEC awards 10 to 30 percent of its penalty collections over $1 million to qualifying whistleblowers), when it should be asking whether it has given the agency enough resources to fulfill its mission to “maintain fair, orderly, and efficient markets.”

Second, while the stated purpose of the whistleblower program is to incentivize individuals to come forward with information about potential violations of securities law, this hasn’t actually led to increases in enforcement actions. Instead of legitimate whistleblowers bringing the most credible information to the SEC, the agency now seems to be deluged by tips that are not highly actionable.

But the biggest problem is that uncovering corporate malfeasance is now a legitimate business model, resulting in powerful firms and misaligned incentives. A single law practice led by former SEC assistant director Jordan Thomas captured about 20 percent of all the SEC’s whistleblower awards through 2022, at which point Thomas left to open up a new firm focused exclusively on whistleblowers. We can admire Thomas and his team’s impact on making those guilty of white-collar crimes pay, and also question whether hundreds of millions of dollars of penalties should be funneled through the hands of an SEC insider turned for-profit business mogul.

Whistleblower tips can be used as weapons of corporate warfare. SEC whistleblower complaints are not required to come from inside a company, or even to rely on insider information. They can be filed on the basis of public data, as long as the whistleblower brings original analysis. Companies might dig up dirt on their competitors and submit tips to the SEC. Ransomware groups have used the threat of SEC whistleblower tips as a tactic to pressure the companies they’ve infiltrated into paying ransoms.

The rise of whistleblower firms could lead to them taking particular “assignments” for a fee. Can a company hire one of these firms to investigate its competitors? Can an industry lobbying group under scrutiny (perhaps in cryptocurrencies) pay firms to look at other industries instead and tie up SEC resources? When a firm finds a potential regulatory violation, do they approach the company at fault and offer to cease their research for a “kill fee”? The lack of transparency and accountability of the program means that the whistleblowing firms can get away with practices like these, which would be wholly unacceptable if perpetrated by the SEC itself.

Whistleblowing firms can also use the information they uncover to guide market investments by activist short sellers. Since 2006, the investigative reporting site Sharesleuth claims to have tanked dozens of stocks and instigated at least eight SEC cases against companies in pharma, energy, logistics, and other industries, all after its investors shorted the stocks in question. More recently, a new investigative reporting site called Hunterbrook Media and its partner hedge fund, Hunterbrook Capital, have churned out 18 investigative reports in their first five months of operation and disclosed short sales and other actions alongside each. In at least one report, Hunterbrook says it filed an SEC whistleblower tip.

Short sellers serve an important disciplining function in markets. But when short selling is combined with whistleblower awards, the same misaligned, profit-hungry incentives can emerge. Properly staffed regulatory agencies don’t have the same potential pitfalls.

AI will affect every aspect of this dynamic. AI’s ability to extract information from large document troves will help whistleblowers provide more information to the SEC faster, lowering the bar for reporting potential violations and opening a floodgate of new tips. Right now, reporting minor or frivolous claims costs the whistleblower nothing; the cost falls entirely on the SEC. And while AI automation will also help SEC staff process tips more efficiently, it could dramatically increase the number of tips the agency has to deal with, further decreasing the efficiency of the program.

AI could be a triple windfall for the law firms engaged in this business: lowering their costs, increasing their scale, and increasing the SEC’s reliance on a few seasoned, trusted firms. As Platt documented, the SEC already relies on a few firms to prioritize its investigative agenda. Experienced firms like Thomas’s might wield AI automation to the greatest advantage. SEC staff struggling to keep pace with tips might have less capacity to look beyond the ones seemingly pre-vetted by familiar sources.

But the real effects will be on the conflicts of interest between whistleblowing firms and the SEC. The ability to automate whistleblower reporting will open new competitive strategies that could disrupt business practices and market dynamics.

An AI-assisted data analyst could dig up potential violations faster, across a greater number of competitor firms, and over a wider scope of possible infractions than any unassisted human could. The AI doesn’t have to be that smart to be effective here. Complaints are not required to be accurate; claims based on insufficient evidence could be filed against competitors, at scale.

Even more cynically, firms might use AI to help cover up their own violations. If a company can deluge the SEC with legitimate, if minor, tips about potential wrongdoing throughout the industry, it might lower the chances that the agency will get around to investigating the company’s own liabilities. Some companies might even use the strategy of submitting minor claims about their own conduct to obscure more significant claims the SEC might otherwise focus on.

Many of these ideas are not so new. There are decades of precedent for using algorithms to detect fraudulent financial activity, with plenty of current-day applications of the latest large language models and other AI tools. In 2019, legal scholar Dimitrios Kafteranis, research coordinator for the European Whistleblowing Institute, proposed using AI to automate corporate whistleblowing.

And not all the impacts specific to AI are bad. The most optimistic possible outcome is that AI will allow a broader base of potential tipsters to file, providing assistive support that levels the playing field for the little guy.

But more realistically, AI will supercharge the for-profit whistleblowing industry. The risks remain as long as submitting whistleblower complaints to the SEC is a viable business model. As with tax farming, the interests of the institutional whistleblower diverge from the interests of the state, and no amount of tweaking around the edges will change that.

Ultimately, AI is not the cause of or solution to the problems created by the runaway growth of the SEC whistleblower program. But it should give policymakers pause to consider the incentive structure that such programs create, and to reconsider the balance of public and private ownership of regulatory enforcement.

This essay was written with Nathan E. Sanders, and originally appeared in The American Prospect.

Posted on October 21, 2024 at 7:09 AM

Comments

Stephen October 21, 2024 9:49 PM

What am I missing here? Rapacious capitalists are dragooned to surface malfeasance by other rapacious capitalists. And they only get paid from the fines that accrue to the regulator. And their actions are transparent to the public.

My analogy would be fathers of boys versus fathers of girls. If I’m a ne’er-do-well and I’m interested in lax regulation from a monolithic federal entity, I’m the father of a boy – I have to worry about precisely one dck and what its owner intends to do with it. I can concentrate my influence on the regulator. If I’m faced with a bevy of regulatory mercenaries, I’m the father of a girl – I need to worry about ALL the dcks. Sure, I might pay one to look the other way, but I can’t stop all of them. Someone will claim the bounty out of righteous indignation (how dare you bribe me) or selfish interest (how dare you bribe me with so little).

AI spam would seem to have a simple solution – register participants and charge a nominal, non-refundable up-front fee.

This is one case where I think the invisible hand is holding the right solution.

ResearcherZero October 22, 2024 1:36 AM

Many modern devices also contain third-party AI products. The list of information that a manufacturer, such as ASUS for example, would like to collect is quite detailed. The amount of data that third-party AI security products have access to and collect is considerable.

In order to upgrade firmware, companies are increasingly asking users to agree to data collection from the device in question, or future firmware updates may not be available.

We have indeed entered the era of mass spying.

Network segmentation can help block such AI services and data collection, but automatic firmware updates will not work if these devices do not have a direct internet connection. Neither will the AI security update, if it fails to detect a direct internet connection.

These products also cannot upload your data if they cannot establish a connection. The price of privacy, then, is security, assuming the product delivers additional security at all.

ResearcherZero October 22, 2024 2:12 AM

There are rules that prohibit “knowingly engaging in any joint research or technology licensing effort with a foreign entity of concern”. They apply to third parties and to those in overseas countries who might work on products governed by these rules.

ResearcherZero October 22, 2024 2:36 AM

I received a couple of rewards from a foreign entity in the form of small pieces of lead. It was not a reward I wanted, but at least they put more thought into it than the government here, which delivered no reward and a very underwhelming response to the theft of designs from highly sensitive military projects, and showed little concern regarding illegal monitoring of employees working on those designs by undeclared agents working for said foreign entity.

But then, had our allies known what was taking place, they may not have been impressed. It’s not like any of our bozos and bureaucrats knew that those projects would be a success. At least they were a success for our adversaries, who used the designs to complete and deploy them.

Sean October 22, 2024 7:06 AM

When you make a metric the goal, the metric gets gamed. Make the payout easy enough, without looking too carefully, and you get the same.

Armando October 23, 2024 2:04 PM

The article raises legitimate concerns about the impact of AI on the programme. As AI technology advances, it could further exacerbate existing problems, potentially leading to a flood of frivolous tips and further undermining the effectiveness of the SEC’s efforts to enforce the law.

For example, the discovery of gold sparked a massive gold rush in the American West in the 19th century. Prospectors flocked to the region, hoping to strike it rich. While many found nothing, a few lucky individuals became immensely wealthy.

In a similar way, the SEC’s whistleblower programme has created a modern-day gold rush. With the potential for huge financial rewards, individuals and companies are incentivised to find and report potential securities law violations. While some whistleblowers have been instrumental in exposing corporate wrongdoing, others may be motivated primarily by personal gain.
Like the gold rush, the programme has attracted a wide range of participants. Some are genuinely concerned about corporate wrongdoing, while others are simply looking to cash in. The potential for abuse and exploitation is high, as evidenced by the rise of whistleblower firms and the use of AI to automate tip generation.

Ultimately, like the gold rush, the SEC whistleblower programme is a double-edged sword. While it can be a valuable tool for exposing wrongdoing, there is also a risk that it will be exploited and abused.

Ebenezer Scrooge October 23, 2024 4:05 PM

The argument for bonuses is straightforward. If a professional ever blows the whistle, they will never get hired in any professional capacity by any employer ever, no matter the merits of their claim. (This is also true for plaintiffs in discrimination suits.) The SEC does not have any significant supervisory power, and the record of bank supervisors is not very encouraging. The Wells Fargo fake-accounts scandal was uncovered by the press, not the supervisors empowered to pore over Wells Fargo’s records. To add to this, courts are already wired in favor of defendants: in burden of proof, weak discovery sanctions, and (often enough) vegetarian damages awards.

So what can the SEC do? Grossly underenforce?

Clive Robinson October 24, 2024 10:01 AM

@ Bruce,

What is with the holding/censorship of any comment about LLM/ML system failings? Worse, the failings of certain humans who will use them for what society regards as “bad”?

ratwithahat October 25, 2024 2:19 PM

@Stephen

The issue is that there are too many reports being made. The system seems to mostly motivate people to blow the whistle as much as possible, without considering how much of an impact it really has.

In short, there are a lot of “false positives” and it decreases the agency’s efficiency since they can’t sort the good stuff from the (increasing) crap.

Wim Ton November 15, 2024 5:47 PM

A similar process has been running in Germany for decades. Lawyers use a crawler to scan websites, and if a site is non-conforming – e.g., the reference to the data protection policy or the contact address is missing – they send a mail: “Your website is breaking the law, but if you pay us 500 Euros, we will not sue you.”
