Tuesday, May 5, 2015

Should Algorithm Audits be mandated for HFT firms?

I was heading into work on the train and came across a WSJ op-ed piece on the need for regulation of the algorithms involved in trading. The article mentions how regulators have not done much since the Flash Crash of 2010.

What is the Flash Crash of 2010?

As noted in the piece, "flash crash hit on the afternoon of May 6, 2010, as riots in Athens and a European debt crisis weighed on markets. In about eight minutes the Dow Jones Industrial Average fell 700 points before rebounding." 

The op-ed goes on to dismiss the "official" explanation (i.e. a large hedge placed by a US firm and the financial shenanigans of a UK-based day trader) and states: "More important, they say, is the role of high-frequency firms, which use hard-to-monitor algorithms to trade large amounts of stock in fractions of seconds. If they trade erratically, the market can come unglued, as happened in the flash crash."

The article notes that the SEC has been exploring mandating disclosure requirements and controls for firms that use algorithms. However, the article also quotes a number of regulators who say they don't have enough funds to keep pace with the firms.

Before I go back down memory lane, it is also worth noting that there are other experts who hold that algorithms need to be regulated from a privacy perspective. Bruce Schneier, a well-known information security expert who helped review the Snowden documents, also calls for "auditing algorithms for fairness" in his latest book, Data and Goliath (see clip below for a summary). He also notes that such audits don't need to make the algorithms public, in the same way that financial statements of public companies are audited today without disclosing every underlying detail. This keeps a balance between confidentiality and public confidence in a company's use of our data.

So is it time for auditing algorithms through an "AlgoTrust" offering?

As I noted on my reflections on "Big Data: A Revolution That Will Transform How We Live, Work, and Think": 

"[H]ow would you go about auditing an algo? Although auditors lack the technical skills of algorithmists, it doesn't prevent them from auditing algorithms. The WebTrust for Certification Authorities (WebTrust for CAs) program could be a model, where assurance practitioners develop a standard in conjunction with algorithmists and enable audits to be performed against that standard. Why is WebTrust for CAs a model? WebTrust for CAs is a technical standard under which an audit firm would "assess the adequacy and effectiveness of the controls employed by Certification Authorities (CAs)". That is, although the cryptographic key generation process goes beyond the technical discipline of a regular CPA, it did not prevent the assurance firms from issuing an opinion."

I also noted:

"some of the ground work for such a service is already established. Fundamentally, an algorithm takes data inputs, processes it and then delivers a certain output or decision. Therefore, one aspect of such a service is to understand whether the algo has "processing integrity" (i.e. as the authors put it, to attest to the "accuracy or validity of big-data predictions"), which is something the profession established a while back through its SysTrust offering."
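To make the "processing integrity" idea above concrete, here is a minimal sketch (all names and invariants are hypothetical, invented for illustration) of how an auditor might test an algorithm as a black box: replay recorded inputs through it and verify that every output satisfies pre-agreed invariants, without ever seeing the algorithm's source code.

```python
def example_trading_algo(order):
    """Stand-in for a firm's proprietary algorithm (a black box to the auditor)."""
    # Toy logic: never submit more than 10% of the order in one slice.
    slice_qty = min(order["qty"], max(1, order["qty"] // 10))
    return {"symbol": order["symbol"], "qty": slice_qty, "side": order["side"]}

def audit_processing_integrity(algo, recorded_inputs, invariants):
    """Replay recorded inputs through the algorithm and check every invariant.

    Returns a list of (input_index, invariant_name) exceptions;
    an empty list means no exceptions were noted.
    """
    findings = []
    for i, order in enumerate(recorded_inputs):
        output = algo(order)
        for name, check in invariants.items():
            if not check(order, output):
                findings.append((i, name))
    return findings

# Invariants the auditor and the firm's algorithmists agree on in advance.
invariants = {
    "side_preserved": lambda inp, out: out["side"] == inp["side"],
    "no_oversize_slice": lambda inp, out: 0 < out["qty"] <= inp["qty"],
}

recorded_inputs = [
    {"symbol": "ACME", "qty": 500, "side": "buy"},
    {"symbol": "XYZ", "qty": 7, "side": "sell"},
]

findings = audit_processing_integrity(example_trading_algo, recorded_inputs, invariants)
print(findings)  # an empty list means the replayed inputs raised no exceptions
```

The point of the sketch is the separation of duties: the firm supplies the black box, both parties agree on the invariants up front, and the auditor only needs the inputs and outputs, which mirrors how WebTrust for CAs lets a CPA opine on controls without mastering the underlying cryptography.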

What I saw as the challenge at the time I penned that blog post was market demand for this type of service. The answer appears to be that the SEC could mandate such audits and leverage the CPA firms the same way they do for financial audits. However, instead of rendering an opinion on the financials, such audit firms would render an AlgoTrust opinion on the algorithms to ensure that they are in line with Generally Accepted Algorithmic Principles instead of Generally Accepted Accounting Principles (sorry, I couldn't resist!).

Beyond WebTrust for Certification Authorities, companies are currently leveraging SysTrust, which has been subsumed into the SOC 2 and SOC 3 audit reports. For example, Salesforce.com gets an audit opinion that provides reasonable assurance that its systems are secure and available, and that it maintains the confidentiality of the information it is provided with.

The AlgoTrust standard should address issues such as the ones raised in the WSJ (i.e. as they relate to trading algos) as well as ensure the preservation of privacy. But it should not stop there. In the original post, Chris Steiner explains how algos are invading all parts of life, including things like robot pharmacists.

We have at least three experts from three different fields, finance, data, and information security, who all see the value in auditing algorithms. If the CPAs don't take the lead on this, who will? As Bruce Schneier notes, it won't be easy, but it is something that will eventually be tackled by either the CPA profession or someone else.
