- Thomas Smith
Technical solutions and their limits
The internet is awash with websites advertising the ability to “find anyone” and “instant background checks.” The marketing language there is obviously geared towards nosy teenagers and nervous dating app users. But in the enterprise space you see the same type of breathless language promising to “revolutionize due diligence,” detect fraud with the push of a button, and save you oodles of money on compliance and legal costs.
It’s true that the technical aspect of our profession has advanced in keeping with the rest of the American economy. There is more data available to private investigators than ever before – though only marginally more than is available to the public at large, a significant volume in itself. Take for example the various tools that parse the tremendous volumes of location data outgassed by the average person’s cell phone. That information is absorbed or purchased by data brokers, resold to various middlemen and software engineers, and finally made available to us. Fortunately for the concerned citizen, that doesn’t exactly turn a qualifying PI shop into a mini-NSA, but the functionality is analogous (not totally surprising considering the tools were largely built by ex-G-men).
Technical solutions like these are evidently useful, as lead generators or auxiliaries to other information gathering methods in litigation support cases or internal investigations. Tracking the movements of a subject is something that private investigators do routinely, and software that makes that easier, less obtrusive, and safer is a welcome development. As is software that assists with financial analysis, document review, or scrutinizing social networks. I use variants of all of these types of tools in my work, and find them valuable – though they would be of limited use to someone who isn’t experienced in spotting the false positives and other little lies in the data, the kind you learn to pick out like a jaded lover.
I take issue with software providers that advertise tools that can replace the human brain – especially an investigative brain. I won’t claim that it’s impossible to do this – one day we may well be confronted with a silicon Sherlock. But I haven’t met him yet amid the dozens of products that are offered to firms like mine and our clients.
The compliance world is particularly crowded with software solutions that claim to automate functions previously conducted by staff investigators or lawyers. Software that does things like automatically rank the “risk” of a vendor or business partner, or screen third parties against media databases. These are not necessarily unhelpful functions in themselves (though the ranking of a nebulous quality like risk poses problems of its own entirely apart from this subject). Indeed, any sales representative would counter that the tools are designed merely to help human analysts target their analysis. The problem is the human tendency to become reliant on software tools while suspending critical thinking or even common sense.
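To make the point concrete, here is a toy sketch of what a vendor “risk ranking” amounts to under the hood – a weighted checklist. This is an illustration of the general technique, not any real product’s methodology; the flag names, weights, and tier cutoffs are all assumptions invented for the example. Notice that every number in it encodes a human judgment made in advance, which is exactly what the analyst suspends when he takes the output tier at face value.

```python
# Hypothetical vendor risk-ranking sketch. The flags, weights, and
# tier thresholds below are invented for illustration -- they are not
# drawn from any actual compliance product.

RISK_WEIGHTS = {
    "high_risk_jurisdiction": 3,
    "government_touchpoints": 2,
    "adverse_media_hit": 4,
    "opaque_ownership": 3,
}

def risk_score(vendor_flags: set) -> int:
    """Sum the weights of whichever risk flags apply to a vendor."""
    return sum(RISK_WEIGHTS.get(flag, 0) for flag in vendor_flags)

def risk_tier(score: int) -> str:
    """Bucket the numeric score into the tiers an analyst would see."""
    if score >= 7:
        return "high"
    if score >= 3:
        return "medium"
    return "low"
```

A vendor with an adverse media hit and opaque ownership scores 7 and lands in the “high” tier – but only because someone decided, months earlier, that those two flags are worth 4 and 3 points. The tool targets the analysis; it does not perform it.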
I have a suspicion – as yet untested – that overreliance on automation is partly to blame for some recent examples of companies getting dinged for FCPA or AML violations despite supposedly rigorous compliance processes. Overreliance on automation has been studied in detail in many fields, including healthcare and aviation. To my knowledge, no one has looked into the phenomenon as it relates to anti-corruption, but I suspect that compliance professionals are no more immune to it than doctors or pilots. My own inquiries into the subject are ongoing. Anecdotally though, many of these same companies, as part of efforts to improve internal controls after a violation, invest in additional automation with the encouragement and approval of regulatory authorities.
However, only a poor craftsman blames his tools. There are countless human elements that go into compliance failures at the corporate level. Some of these are probably insurmountable, software or no software. For instance, planning and policy development have their limitations, especially when it comes to anticipating the actions of malevolent actors. My own experience suggests that the type of person who designs internal controls – accounting, compliance, etc – has a hard time anticipating the actions of his adversary. Compliance procedures are based on analyses of past failures – a reactive exercise by definition. That isn’t a black mark against the architects of compliance policies. Criminals engaged in fraud are often creative, intelligent people – they devise novel schemes designed to take advantage of the honest, credulous, and well-meaning.
Efforts were made to counter this in recent memory. Many of my colleagues in the investigations business have moved through various banks, investment houses, and Fortune 500 corporations that are the traditional client base of our profession and the related compliance consulting industry. Grizzled ex-cops and PIs were hired at least partially to help corporate actors anticipate the actions of malefactors, as well as perform more general detective work. Many of these hiring programs have been reduced or suspended over the years. Presumably, the costs were found to outweigh the benefits – technical experts are expensive to retain on salary in any profession.
Enter the software developers, to fill the gaps left by the departure of professional staffers. Cost control in investigations, compliance, and law is a real problem – I can’t deny that. It’s an issue that we think about frequently at our firm, and which we try to build into our professional philosophy and proposals. It’s a problem that companies have not yet solved, and software and automation will undoubtedly form parts of the solution.
In many ways, they already do. Banks that are required to file SARs make extensive use of automation to detect potential money laundering, and much time and money has been spent on reducing the number of false positives that necessarily arise in transaction monitoring. Elsewhere, companies automate the analysis of things like payments made to third parties, expense account debits, and all of the other little numbers that human activity leaves behind like footprints.
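The automated screening described above is often, at bottom, a statistical outlier test. The following is a minimal sketch of one common approach – flagging a payment that deviates sharply from a party’s payment history – with the method and the cutoff chosen as illustrative assumptions, not any bank’s actual rules. It also shows where the false positives come from: every legitimate but unusual payment past the threshold lands on an analyst’s desk.

```python
# Minimal sketch of automated payment screening: flag a payment that
# sits far outside the history of payments to the same party. The
# z-score method and 3.0 cutoff are illustrative assumptions only.
from statistics import mean, stdev

def flag_unusual(history: list, payment: float, z_cutoff: float = 3.0) -> bool:
    """Return True if `payment` lies more than `z_cutoff` standard
    deviations from the mean of prior payments in `history`."""
    if len(history) < 2:
        return True  # too little history to judge -- route to a human
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return payment != mu
    return abs(payment - mu) / sigma > z_cutoff
```

Against a history of payments clustered around $100, a $200 payment is flagged and a $102 payment passes. The arithmetic is impeccable; what it cannot tell you is whether the $102 payment is an invoice or a bribe split into inconspicuous pieces – which is the private investigator’s point.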
In the majority of cases, this type of automated analysis – used to flag suspicious transactions for further review, or highlight unusual payments – is helpful and benign. If a data scientist told you that these numbers don’t lie, he would be largely correct. But a private investigator must counter that people sometimes lie about numbers.
My colleagues who are data scientists will find fault with this, countering that detecting false statements is within the capabilities of their analysis. In fact, data scientists and private investigators have been arguing about the proper role and limits of one another’s professions for as long as the former has existed as a trade (our trade is of course much older). To them, I’m certain that our emphasis on conversation, personality, and patterns of behavior as a means of detecting crime or malfeasance seems archaic. It would – it is. But to my knowledge, there is no tool yet developed that can accurately decipher a complicated lie, unravel a scheme, or identify a malefactor with evidence.