This is the fourth and final instalment of a multi-part exploration of the audit, assurance, compliance and related concepts raised in the book Big Data: A Revolution That Will Transform How We Live, Work, and Think (the book is also available as an audiobook and, hey, while I am at it, here's the link to the e-book). In the last two posts we explored tactical examples of how big data can help auditors execute more efficient and effective audits. The book also examines the societal implications of big data, and in this instalment we explore its privacy implications.
What are the privacy implications of big data?
In the past three instalments, we've explored the opportunities that big data affords the audit profession and society at large. In this article we look at the privacy implications raised by the book.
When we think of a totalitarian state, we flash back to the regimes of World War II or the Soviet era. The book describes how the East German communist state invested vast resources in gathering data on its citizens to determine who conformed with the state's ideology and who didn't. The East German secret police (the Ministerium für Staatssicherheit, or "Stasi") accumulated, amongst other things, 70 miles of documents. Today, however, big data analytics enables corporations and governments to mine the digital exhaust people leave behind through social media, cell phone use and email logins, essentially eliminating whatever privacy people have.
Some may point to anonymization as a potential solution to the problem. However, the authors highlight how New York Times reporters were able to comb through anonymized search data published by AOL and positively establish the identity of users. This shows that the powerful tools that have emerged from big data alter the privacy landscape, and that privacy controls need to be rethought from this perspective.
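The re-identification risk behind the AOL episode can be illustrated with a toy sketch (the records, attribute names and values below are entirely hypothetical, not the actual AOL data). Removing names is not enough: if a combination of "quasi-identifiers" such as ZIP code, birth year and gender is unique in the dataset, anyone who already knows those attributes about a person can link the "anonymous" record back to them. This is the intuition behind k-anonymity — every quasi-identifier combination should be shared by at least k records.

```python
from collections import Counter

# Hypothetical "anonymized" records: names removed, but
# quasi-identifiers (ZIP code, birth year, gender) retained.
records = [
    {"zip": "10001", "birth_year": 1975, "gender": "F", "search": "gardening tips"},
    {"zip": "10001", "birth_year": 1975, "gender": "F", "search": "local florists"},
    {"zip": "30329", "birth_year": 1943, "gender": "M", "search": "tax advice"},
    {"zip": "10001", "birth_year": 1982, "gender": "M", "search": "bike shops"},
]

def quasi_id(rec):
    """The attribute combination an outside observer might already know."""
    return (rec["zip"], rec["birth_year"], rec["gender"])

# Count how many records share each quasi-identifier combination.
group_sizes = Counter(quasi_id(r) for r in records)

for rec in records:
    if group_sizes[quasi_id(rec)] == 1:
        # A group of one: knowing this person's ZIP, birth year and gender
        # is enough to link the record (and its search terms) to them.
        print(f"Re-identifiable: {quasi_id(rec)} -> {rec['search']}")
```

In this toy dataset, two of the four records are singletons and thus trivially re-identifiable; real de-identification schemes generalize or suppress quasi-identifiers until every group reaches a minimum size.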
The authors, however, raise a much more interesting point when discussing privacy in the era of big data: the conflict between privacy and profiting from big data. They note that the value of big data emerges largely from its secondary uses. Privacy policies, however, require the user to consent to specific uses of data at the time they sign up for the service, so future big data analytics are essentially limited to the uses the user agreed to at sign-up. Corporations, in their drive to maximize profits, will ultimately make privacy policies so loose (i.e. broad enough to cover secondary uses) that the user essentially has to give up all their privacy in order to use the service. What the authors propose instead is an accountability framework: similar to how stock-issuing companies are accountable to securities regulators, organizations would be accountable to a privacy body of sorts that reviews their use of big data and holds them accountable for its negative consequences.
For those of us who have been involved in privacy compliance, such an approach would make privacy issues real for companies and push them to deal with those issues in a proactive manner. We saw how companies' attitudes towards controls over financial reporting shifted from mild interest (or indifference) to active concern with the passage of Sarbanes-Oxley. In contrast, no similar fervour can be found in the business landscape when it comes to addressing privacy issues. Although the solution is not obvious, the reality is that, absent accountability, companies will make their privacy notices meaningless in order to reap the ROI from their investments in big data.