Showing posts with label SysTrust. Show all posts

Monday, September 7, 2015

BNY Mellon Software Glitch: Time to make SysTrust mandatory?

As was widely reported in the business press, BNY Mellon experienced a technical glitch that affected its ability to price mutual funds accurately. Based on the press release from one of the affected funds, the problems started on Monday, August 24th, when one of BNY Mellon's systems, "InvestOne" (managed by SunGard), was pricing about 800 mutual funds inaccurately.

So what was the cause of this fiasco?

According to CNN, "BNY Mellon outage occurred after a SunGard accounting system it uses became "corrupted" following an upgrade. A back-up also failed."

Normally, this type of incident subjects the party experiencing it to intense scrutiny over what went wrong. However, as I went through the timeline posted by the company, I found (reading between the lines) that they did a number of things right.

That being said, there is always room for improvement. When I was reflecting on this, I speculated that this was another case of inadequate testing of the system upgrade. However, according to SunGard, this was not the case. As they noted on their website:

"The issue appears to have been caused by an unforeseen complication resulting from an operating system change performed by SunGard on Saturday, August 22nd. This maintenance was successfully performed in a test environment, per our standard operating procedure, and then replicated in SunGard’s U.S. production environment for BNY Mellon. This change had also been previously implemented, without any issues, in other InvestOne environments. Unfortunately, in the process of applying this change to the SunGard production environment of InvestOne supporting BNY Mellon’s U.S. fund accounting clients, that environment became corrupted. Additionally, the back-up environment hosted by SunGard, supporting BNY Mellon’s U.S. fund accounting clients, was concurrently corrupted, thus impeding automatic failover. Because of the unusual nature of the event, we are confident this was an isolated incident due to the physical/logical system environment and not an application issue with InvestOne itself."

Given my background as a CA, CPA and CISA, I have always thought it an odd contradiction that we expect infrastructure (roads, dams, bridges, etc.) to be certified by engineers as being in working order (the key word being "expect"; as John Oliver notes in the video below, reality is not exactly up to snuff!), yet we have no such expectations for the technology that runs the Information Age.

And that is why I have always proposed having a framework like SysTrust (now SOC 2 and SOC 3) in place that requires companies to ensure their systems are reliable: secure, available, and able to process information without messing it up.

The experience of SunGard and BNY Mellon, I think, actually proves the case. Although companies like SunGard likely have such controls in place, it is beneficial to others to have a second set of eyes on those controls, confirming that they exist, are designed effectively, and are operating effectively. With mandatory audits in place, best practices would circulate through the audit process, much as they do in the financial auditing world through "management letter points".

One other area that we should explore is the total impact of this error, as it will give insights into the "total impact of failed IT controls". This will be the topic of the next blogpost.



Tuesday, May 5, 2015

Should Algorithm Audits be mandated for HFT firms?

I was heading into work on the train and came across the WSJ's op-ed piece on the need for regulation of the algorithms involved in trading. The article notes that regulators have not done much since the Flash Crash of 2010.

What is the Flash Crash of 2010?

As noted in the piece, "flash crash hit on the afternoon of May 6, 2010, as riots in Athens and a European debt crisis weighed on markets. In about eight minutes the Dow Jones Industrial Average fell 700 points before rebounding." 

The op-ed goes on to dismiss the "official" explanation (i.e. a large hedge placed by a US firm and the financial shenanigans of a UK-based day trader) and states: "More important, they say, is the role of high-frequency firms, which use hard-to-monitor algorithms to trade large amounts of stock in fractions of seconds. If they trade erratically, the market can come unglued, as happened in the flash crash."

The article notes that the SEC has been exploring mandatory disclosure requirements and controls for firms that use algorithms. However, it also quotes a number of regulators who say they do not have enough funding to keep pace with these firms.

Before I go back down memory lane, it is worth noting that other experts hold that algorithms need to be regulated from a privacy perspective as well. Bruce Schneier, a well-known information security expert who helped review the Snowden documents, calls in his latest book, Data and Goliath (see clip below for a summary), for "auditing algorithms for fairness". He also notes that such audits need not make the algorithms public, which is the same way the financial statements of public companies are audited today. This keeps a balance between confidentiality and public confidence in the company's use of our data.
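Schneier's point that an algorithm can be audited without being made public suggests black-box testing: the auditor queries the algorithm and compares outcomes across groups, without ever seeing its internals. A minimal sketch of such a fairness check (made-up data and the "four-fifths" threshold used in US employment practice, purely for illustration):

```python
# Black-box fairness check: compare approval rates across groups using only
# the algorithm's observed decisions, not its internals.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs gathered from black-box queries."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates, threshold=0.8):
    """Four-fifths rule: flag any group whose approval rate falls below
    `threshold` times the best-treated group's rate."""
    top = max(rates.values())
    return {g: r / top >= threshold for g, r in rates.items()}

# Usage with made-up query results: group A approved 80%, group B 50%.
decisions = [("A", True)] * 80 + [("A", False)] * 20 + \
            [("B", True)] * 50 + [("B", False)] * 50
rates = approval_rates(decisions)
print(disparate_impact(rates))  # {'A': True, 'B': False} since 0.5 / 0.8 < 0.8
```

The auditor publishes only the pass/fail conclusion, keeping the algorithm itself confidential, which is exactly the balance Schneier describes.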


So is it time for auditing algorithms through an "AlgoTrust" offering?

As I noted on my reflections on "Big Data: A Revolution That Will Transform How We Live, Work, and Think": 

"[H]ow would you go about auditing an algo? Although auditors lack the technical skills of algoritmists, it doesn't prevent them from auditing algorithms. The WebTrust for Certification Authorities (WebTrust for CAs) could be a model where assurance practitioners develop a standard in conjunction with algorithmists and enable audits to be performed against the standard. Why is WebTrust for CAs a model? WebTrust for CAs is a technical standard where an audit firm would "assess the adequacy and effectiveness of the controls employed by Certification Authorities (CAs)". That is, although the cryptographic key generation process is something that goes beyond the technical discipline of a regular CPA, it did not prevent the assurance firms from issuing an opinion."

I also noted:

"some of the ground work for such a service is already established. Fundamentally, an algorithm takes data inputs, processes it and then delivers a certain output or decision. Therefore, one aspect of such a service is to understand whether the algo has "processing integrity" (i.e. as the authors put it, to attest to the "accuracy or validity of big-data predictions"), which is something the profession established a while back through its SysTrust offering."

What I saw as the challenge at the time I penned that blog post was market demand for this type of service. The answer appears to be that the SEC could mandate such audits and leverage the CPA firms the same way it does for financial audits. However, instead of rendering an opinion on the financials, the audit firms would render an AlgoTrust opinion on the algorithms, ensuring they are in line with Generally Accepted Algorithmic Principles instead of Generally Accepted Accounting Principles (sorry, I couldn't resist!).

Beyond WebTrust for Certification Authorities, companies are already leveraging SysTrust, which has been subsumed into the SOC 2 and SOC 3 audit reports. For example, Salesforce.com obtains an audit opinion providing reasonable assurance that its systems are secure and available and that it maintains the confidentiality of the information entrusted to it.

The AlgoTrust standard should address issues such as the ones raised in the WSJ (i.e. relating to trading algos) as well as ensuring the preservation of privacy. But it should not stop there. In the original post, Chris Steiner explains how algos are invading all parts of life, including things like robot pharmacists.

We now have at least three experts from three different fields (finance, data, and information security) who all see the value in auditing algorithms. If the CPAs don't take the lead on this, who will? As Bruce Schneier notes, it won't be easy, but it is something that will eventually be tackled, whether by the CPA profession or by someone else.


Monday, June 16, 2014

Auditing the Algorithm: Is it time for AlgoTrust?

This is the third instalment of a multi-part exploration of the audit, assurance, compliance and related concepts brought up in the book Big Data: A Revolution That Will Transform How We Live, Work, and Think (the book is also available as an audiobook, and while I am at it, here's the link to the e-book). In the last two posts we explored more tactical examples of how big data can assist auditors in executing a more efficient and effective audit. The book, however, also examines the societal implications of big data. In this instalment, we explore the role of the algorithmist.

Why do we need to audit the "secret sauce"?
When it comes to big data analytics, the decisions and conclusions the analyst makes hinge greatly on the underlying algorithm. Consequently, as big data analytics increasingly drive the actions of companies and societal institutions (e.g. schools, governments, non-profit organizations, etc.), society becomes ever more dependent on the "secret sauce" that powers these analytics. The term "secret sauce" is quite apt because it highlights the technical opaqueness that is commonplace with such things: the average person likely cannot understand how a big data analytic arrived at a specific conclusion. We discussed this in the previous post as the challenge of explainability, but the nuance here is how to explain algorithms to external parties, such as customers, suppliers, and others.

To be sure, this is not the only book that points to the importance of algorithms in society. Another example is "Automate This: How Algorithms Came to Rule Our World" by Chris Steiner, which (as the title suggests) explains how algorithms have come to dominate our society. The book brings up common examples such as the "flash crash" and the role that "algos" play on Wall Street and in the banking sector, as well as how NASA used algorithms to assess personality types for its flight missions. It also goes into the arts: for example, it discusses an algorithm that can predict the next hit song or hit screenplay, and algorithms that generate classical music that impresses aficionados - until they find out an algorithm generated it! The author, Chris Steiner, discusses this trend in the following TEDx talk:



So Mayer-Schönberger and Cukier suggest the need for a new profession, which they term "algorithmists". According to them:

"These new professionals would be experts in the areas of computer science, mathematics, and statistics; they would act as reviewers of big-data analyses and predictions. Algorithmists would take a vow of impartiality and confidentiality, much as accountants and certain other professionals do now. They would evaluate the selection of data sources, the choice of analytical and predictive tools, including algorithms and models, and the interpretation of results. In the event of a dispute, they would have access to the algorithms, statistical approaches, and datasets that produced a given decision."

They also extrapolate this thinking to an "external algorithmist", who would "act as impartial auditors to review the accuracy or validity of big-data predictions whenever the government required it, such as under court order or regulation. They also can take on big-data companies as clients, performing audits for firms that wanted expert support. And they may certify the soundness of big-data applications like anti-fraud techniques or stock-trading systems. Finally, external algorithmists are prepared to consult with government agencies on how best to use big data in the public sector.

As in medicine, law, and other occupations, we envision that this new profession regulates itself with a code of conduct. The algorithmists’ impartiality, confidentiality, competence, and professionalism is enforced by tough liability rules; if they failed to adhere to these standards, they’d be open to lawsuits. They can also be called on to serve as expert witnesses in trials, or to act as “court masters”, which are experts appointed by judges to assist them in technical matters on particularly complex cases.

Moreover, people who believe they’ve been harmed by big-data predictions—a patient rejected for surgery, an inmate denied parole, a loan applicant denied a mortgage—can look to algorithmists much as they already look to lawyers for help in understanding and appealing those decisions."

They also envision such professionals working internally within companies, much the way internal auditors do today.

WebTrust for Certification Authorities: A model for AlgoTrust?
The authors bring up a good point: how would you go about auditing an algo? Although auditors lack the technical skills of algorithmists, it doesn't prevent them from auditing algorithms. The WebTrust for Certification Authorities (WebTrust for CAs) could be a model where assurance practitioners develop a standard in conjunction with algorithmists and enable audits to be performed against the standard. Why is WebTrust for CAs a model? WebTrust for CAs is a technical standard where an audit firm would "assess the adequacy and effectiveness of the controls employed by Certification Authorities (CAs)". That is, although the cryptographic key generation process is something that goes beyond the technical discipline of a regular CPA, it did not prevent the assurance firms from issuing an opinion.

So is it time for CPA Canada and the AICPA to put together a draft of "AlgoTrust"?

Maybe.

Although the commercial viability of such a service is hard to predict, it would at least start the discussion around how society can achieve the outcomes Mayer-Schönberger and Cukier describe above. Furthermore, some of the groundwork for such a service is already established. Fundamentally, an algorithm takes data inputs, processes them, and then delivers a certain output or decision. Therefore, one aspect of such a service is to understand whether the algo has "processing integrity" (i.e., as the authors put it, to attest to the "accuracy or validity of big-data predictions"), which is something the profession established a while back through its SysTrust offering. To be sure, this framework would have to be adapted. For example, algos are used to make decisions, so there needs to be some thinking around how to define materiality in terms of the total number of "wrong" decisions, as well as how to define "wrong" in an objective and auditable manner.
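The inputs-processing-outputs framing suggests what a processing-integrity test might look like in practice: replay benchmark cases through the algorithm under audit and compare its rate of "wrong" decisions to a materiality threshold. A minimal sketch, with hypothetical names, a toy pricing rule, and an assumed 1% threshold (all illustrative, not an established standard):

```python
# Sketch of a processing-integrity test: run agreed benchmark cases through
# the algorithm under audit and measure the rate of wrong decisions against
# a materiality threshold. The 1% threshold and the cases are assumptions.
def error_rate(algo, cases):
    """cases: list of (input, expected_output) pairs agreed with the auditee."""
    wrong = sum(1 for x, expected in cases if algo(x) != expected)
    return wrong / len(cases)

def within_materiality(algo, cases, threshold=0.01):
    """True if the observed error rate is at or below the materiality threshold."""
    return error_rate(algo, cases) <= threshold

# Usage: auditing a toy fund-pricing rule against benchmark valuations.
toy_pricer = lambda holdings: round(sum(holdings), 2)
cases = [([10.0, 5.25], 15.25), ([1.1, 2.2], 3.3)]
print(within_materiality(toy_pricer, cases))  # True: both benchmark cases match
```

The hard part, as noted above, is not this mechanical check but agreeing on the benchmark cases and on what counts as "wrong" in an objective, auditable way.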

AlgoTrust, as a concept, illustrates not only an emerging area into which auditors can extend their assurance skill set, but also how the profession can provide thought leadership on dealing with the opaqueness of algorithms - just as it did with financial statements nearly a century ago.