Tuesday, February 17, 2015

Understanding the Audit Opinion: "He's a good guy"

One of the challenges in understanding the value proposition of an external audit is grasping the link, or more accurately the lack of a link, between the work performed and the actual output delivered to the client: the audit opinion.

 

Essentially, the audit firm is vouching for the financial story that the client is communicating on an annual basis. It is akin to vouching for someone in your social circles and saying that he or she is a "good guy or gal". 

The interesting thing is that the amount of "evidence" you have to present to back up your claim can change greatly depending on the context. Consider the following scenarios.
  • He's a good guy to hang out with: This is the lowest-risk situation because the impact is low: if the guy turns out not to be good, it's just one night wasted. Evidence required: pretty much your word. 
  • He's a good guy to hire: The risk here is (your) reputational risk: if he actually is a bad employee, then your reputation is tarnished. The risk to the employer is actually quite low - they can fire the guy if he's not good. Evidence required: ultimately it is still your word, but the employer relies on you to know from experience that your friend's resume is true, since your reputation is on the line. 
  • He's a good guy to loan money to: Now it gets interesting! Here your friend is going to hand over cash to this person based on your testimony. Evidence required: you will likely need to explain how he has paid you back in the past, has a job, or has rich parents who can cover him if he doesn't pay. 
  • He's a good guy to do business with: Here your friend is not only going to put cash at risk based on your testimony, but also share their work life with him and rely on him to actually deliver. Evidence required: you will likely need to explain that he is reliable and hardworking, describe his previous work experience and how much money he will invest in the business, and somehow prove these things. 
  • He's a good guy to live with: It's debatable whether more assurance is required in this context or the previous one. The thing with business is that you are tied in for the long run, whereas most leases only run a year :). Evidence required: You need to have lived with him and provide first-hand evidence about how clean, easy to get along with, and overall considerate he is.
I realize that this analogy is not perfect: the financial statements are issued for the same purpose in each case, and it is the risks of material misstatement or audit failure that are the key drivers of the work. However, it illustrates that the output - an opinion on an issue - remains consistent while the amount of work behind it can vary greatly. 




Thursday, February 12, 2015

Google's Mobile Carrier Move: A Big Data Play?

Google has decided to enter the mobile market. It has deals with Sprint and T-Mobile to operate as a mobile virtual network operator (MVNO), renting capacity on these two carriers' networks. As per the WSJ, there are no details yet on plans, coverage and other specifics.

This is not the first time that Google has ventured beyond its traditional online offerings. A few years ago it began to offer super-fast fiber internet and TV services in Kansas City as well as other US cities. That move, like this one, attracted much attention because it was seen as something that could alter the competitive landscape. However, moves like this show that there is more to Google than a bunch of engineers who are just interested in building things like this:


Although some may dismiss this as a toy, it is actually an "exponential technology" that will fundamentally shift how society functions. But I digress; that is the subject of a different post.

Hearkening back to business strategy class and the infamous Porter five forces model, Google is cutting out a key area of risk: the last mile to the customer. By inserting itself as the mobile operator, it ensures that it can deliver its services (e.g. Search, Gmail, Google Docs, etc.) and content (e.g. YouTube) straight to the customer without any interference from the Internet provider. Google could use a strategy like some Canadian cable TV providers, which offer a streaming video service (e.g. Shomi) that does not count against the customer's bandwidth cap. 

But perhaps the hidden strategic objective is a big data play: what could Google do with the new data feeds? Sure, it already gets plenty from correlating the information collected through Android devices. However, it will now be able to combine that with the additional data that moves through its MVNO network, such as demographic information and location data. What good is this to Google? In a word: advertising. Advertising is still the biggest source of Google's revenue, and adding this pool of data to its reservoir can only add to the bottom line.

Friday, January 30, 2015

Is this the 80/20 of Security?

For the past 10 years or so, I have been teaching what has been considered the IT prep course for the major exam that students write in Canada to get their CA designation. Now, with the merger of the accounting designations in Canada, the revised CPA Competency Map has reduced the focus on IT. The upside is that the course I teach can now be more about what's useful from a practical perspective. In the past I taught security as a list of controls:

  • Security Architecture/Boundary
  • Policies and Standards
  • Asset Classification & Management
  • Risk Assessment
  • Personnel Qualification & Trustworthiness 
  • Responsibility & Accountability
  • Security Awareness
  • User Access Management
  • Physical Access Controls 
  • Network Access and Communication Control 
  • Logical Access Controls 
  • Intrusion Detection & Response
  • Eliciting Compliance
  • Monitoring & Learning
But then I asked myself: how do you think about security conceptually? So I thought about using the SysTrust definition of a system as a way to group the key InfoSec controls. Here's what I came up with:




What do you think? 

Below are some notes from the deck that elaborate on the above.


Risk Assessment
  • Key components of risk analysis? Risk = Impact X Likelihood
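To make the formula concrete, here is a minimal sketch of a risk-scoring helper. The 1-to-5 rating scales and the "high risk" threshold of 15 are illustrative assumptions of mine, not prescribed values.

```python
# Minimal sketch of Risk = Impact x Likelihood.
# The 1-5 rating scales and the "high risk" threshold of 15 are
# illustrative assumptions, not prescribed values.

def risk_score(impact: int, likelihood: int) -> int:
    """Both inputs are rated on a 1 (low) to 5 (high) scale."""
    if not (1 <= impact <= 5 and 1 <= likelihood <= 5):
        raise ValueError("impact and likelihood must be between 1 and 5")
    return impact * likelihood

# Example: a breach of a customer database (impact 5) that is
# moderately likely (likelihood 3) scores 15 out of a possible 25.
score = risk_score(impact=5, likelihood=3)
print(score, "high risk" if score >= 15 else "monitor")
```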


Governance
  • Governance, responsibilities & accountabilities 
  • Develop security function 
  • “tone at the top”: CEO has ultimate responsibility
  • CISO versus no CISO: 
  • Would you trust a bank without a CISO? How about a hotel?
  • Board & Management
  • Security integral part of IT governance
  • Funding security function
  • Average 6 to 7% of the IT Budget
  • Manage security risk that emanates from relationships with third parties
  • Policies & standards
  • Policies and standards:
  • Serious about security: take steps needed
  • Consult ISO 27001/2, etc. 
  • Have a methodology, define risk appetite, etc.
  • Manufacturing versus cloud computing provider 
  • Other
  • Define security roles
  • Define security responsibilities for everybody
  • Role for internal audit 


People
  • Background Checks
  • Human resource procedures to verify background work history of new hires.
  • Check qualifications
  •  Employees first line and last line of defense
  • E.g. Insider threat
  • Incentives: fire bottom 20% = problem?
  • Acceptable Use Policy
  •  Provides limits as to how computing facilities can be used, e.g. LAN, laptops, PDAs, etc
  •  Level of personal use
  • Controls
  •  Awareness/Orientation training/Sign statement
  •  Block sites (hotmail, gmail, facebook, etc)
  •  Monitor usage 
  • Security Awareness & Training
  • New employee training
  • Need to communicate policies and standards to employees, customers (e.g. online banking), suppliers, service providers (e.g. SLA), etc
  • Marketing Security: Remind employees regularly 
  • Provide easy access to policies
  • Policies need to be properly worded (should vs must)
  • Workshops/Tutorials on security: e.g. encrypting USB
  • Awareness posters, screensavers
  • Automate security
  • Termination
  • Terminate all access upon letting an employee go
  • Must make part of HR processes
Data
  • Asset Classification
  • Data Classification
  •  Sensitivity: impact of  unauthorized disclosure; privacy, confidentiality
  • Public, internal, confidential, highly confidential
  • Inventory & Asset Management (Data > Devices)
  • Devices and information held; incl. outsourced entities
  • Classification drives who can access and modify the information
  •  Cost-benefit analysis: encrypt what needs to be encrypted
  • Monitor access to sensitive systems, files, databases,   
  • Encryption (a short code sketch after this list illustrates the symmetric/asymmetric distinction)
  • Used to prevent data alteration and unauthorized viewing, and to verify authenticity
  • Depends on mathematical algorithms to transform the data
  • The "key" is the data that is used to make an encryption or decryption unique
  • Private key (symmetric) system - the receiver must know which key was used to encipher the message; such keys must be protected
  • Public key (asymmetric) system - uses 2 keys
  • The enciphering key is made public
  • A different key is used to decipher
  • Encryption Standards
  • Algorithm + Key
  • DES, AES:  Private Key (symmetric) Algorithms
  • RSA:  Public Key Algorithm
  • PGP:  Open source equivalent of RSA
  • 128, 256 bit technology (length of key - longer keys are harder to break with brute force methods)
  • In a good approach, the security rests on secrecy of the key, not secrecy of the encryption algorithm
  • Wireless: WEP is no good, use WPA, e.g. TJX
  • Data Retention and Disposal Policy
  • Data should be retained based on regulatory/statutory/operational requirements
  • If data is retained longer than required, it could be breached
  • Data should be destroyed after it's no longer needed
  •  Secure overwriting, degaussing, (not formatting!)
  •  Physical destruction (e.g. incineration, shred, etc)
  •  Integrate into asset disposal/sale process
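To make the symmetric versus asymmetric (public key) distinction from the encryption notes concrete, here is a minimal sketch using Python's third-party cryptography package. The messages, key size and padding choices are illustrative assumptions, not a recommended configuration.

```python
# Minimal sketch of symmetric vs. asymmetric (public key) encryption using
# the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Symmetric (private key) system: sender and receiver share one secret key,
# so the key itself must be protected in transit and at rest.
shared_key = Fernet.generate_key()
ciphertext = Fernet(shared_key).encrypt(b"wire transfer approved")
plaintext = Fernet(shared_key).decrypt(ciphertext)

# Asymmetric (public key) system: the enciphering key is made public,
# while a different, private key is used to decipher.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
rsa_ciphertext = public_key.encrypt(b"wire transfer approved", oaep)
rsa_plaintext = private_key.decrypt(rsa_ciphertext, oaep)

print(plaintext, rsa_plaintext)
```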

Infrastructure
  • Network: Firewall
  • “Filters” traffic from inside to outside & outside to in
  • Permits traffic based on its configuration
  • Protected against tampering
  • Packet filter (see the toy rule-table sketch at the end of this list)
  • Intrusion Detection/Prevention
  • Intrusion Detection System (IDS)
  • Firewall: Permit/Blocks, IDS Analyzes activity
  • Analyzes user activity: threat score
  • Sends alerts to security admin: problem with false positives - may dismiss actual threat 
  • IPS can log off users
  • IDS: Can it detect encrypted attacks?
  • Link to SDLC?
  • Physical access controls
  • Safeguard against physical abuse, damage and destruction.
  • Isolation and restriction - use locks, effective key management, video, sensing devices
  • Tailgating: Man-trap, awareness
  • Locations of systems: away from fire and water sources (e.g. kitchen)
  • Hardening
  • Physical Access Control Considerations
  • Cost
  • Number of Type I (false rejection) and Type II (false acceptance) errors
  • Average response time
  • Ability to manage multiple users
  • Satisfy ergonomic issues (E.g. retinal scan is quite invasive)
  • Virtual Private Network (VPN)
  • Encrypted/authenticated access to the network
  • Modem lines create problems
  • Callback modems: the modem will call back a pre-specified number
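The packet-filter idea from the firewall notes is easier to picture with a toy rule table. The sketch below is purely illustrative: the addresses, ports, rules and default-deny policy are assumptions made for demonstration, not a real firewall configuration.

```python
# Toy packet-filter sketch: rules are evaluated top-down and the first match
# wins; anything that matches no rule is denied by default. The addresses,
# ports and rules below are illustrative assumptions, not a real policy.
import ipaddress

RULES = [
    {"action": "permit", "src": "10.0.0.0/8", "dst_port": 443},   # internal users out to HTTPS
    {"action": "permit", "src": "0.0.0.0/0",  "dst_port": 25},    # inbound mail to the mail gateway
    {"action": "deny",   "src": "0.0.0.0/0",  "dst_port": 3389},  # block RDP from anywhere
]

def filter_packet(src_ip: str, dst_port: int) -> str:
    """Return 'permit' or 'deny' for a packet based on the rule table."""
    for rule in RULES:
        in_src = ipaddress.ip_address(src_ip) in ipaddress.ip_network(rule["src"])
        if in_src and dst_port == rule["dst_port"]:
            return rule["action"]
    return "deny"  # default-deny: traffic is blocked unless explicitly permitted

print(filter_packet("10.1.2.3", 443))      # permit
print(filter_packet("203.0.113.7", 3389))  # deny
```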
Software
  • Access management
  • What are the trade offs?
  • Access management
  • Privilege management
  • Log and review this type of access
  • Enables Segregation of duties
  • Separate user and information system roles, separate within information system group
  • Development and data entry
  • Separate within user role as to incompatible functions
  • initiation and authorization of transactions, recording of transactions, custody of assets, and reconciliation  
  • Logical Access Controls
  • User ID:
  • Linked to name, mdatardina@deloitte.ca 
  • Based on job: Accountspayable@xyz.com
  • No association: User12@xyz.com  Problem?
  • Logical Access Controls
  • Authentication - the user is who he/she claims to be
  • Passwords
  • Random vs user generated
  • Rule based: What are the rules? (see the password-check sketch at the end of this list)
  • Phrases: "Cat jumped over the lazy dog in Sarnia" becomes Cjotldis1
  • Plastic magnetic-strip cards 
  • Example?
  • Smart cards 
  • Example?
  • Biometric devices - fingerprints, hand geometry, eye retina patterns; consider Type I/Type II
  • Access control software- allows controlled access - locks out illegitimate users, e.g. Active Directory for Windows
  • Increased use of single-sign-on: authenticate once across multiple platforms
    • Pro: ease-of-access
    • Con: break one password, can break into multiple systems
  • Could also use profile management 
  • Allocate standard access privileges to users based on their group, rather than individual basis, e.g. AP clerk can access AP, network, office suite, etc
  • Reduces admin costs and allows easier access and rule setting
  • Anti-Virus Controls  
  • Anti-virus software
  • Installed and configured properly
  • Update regularly 
  • Won’t help against zero day
  • Ensure automated scans are scheduled.
  • Scan network
  • Scan desktop
  • Run at sign-on
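To illustrate the "rule based" point in the password notes above, here is a minimal sketch of a password-complexity check. The specific rules (length, character classes) are common examples chosen for illustration, not a prescribed standard.

```python
import re

# Minimal sketch of a rule-based password check. The specific rules
# (12+ characters, mixed case, a digit, a symbol) are common examples
# used for illustration, not a prescribed standard.
RULES = {
    "at least 12 characters": lambda p: len(p) >= 12,
    "an uppercase letter":    lambda p: re.search(r"[A-Z]", p) is not None,
    "a lowercase letter":     lambda p: re.search(r"[a-z]", p) is not None,
    "a digit":                lambda p: re.search(r"[0-9]", p) is not None,
    "a symbol":               lambda p: re.search(r"[^A-Za-z0-9]", p) is not None,
}

def check_password(password: str) -> list[str]:
    """Return the list of rules the password fails (empty list = passes)."""
    return [rule for rule, test in RULES.items() if not test(password)]

# The passphrase trick from the notes: "Cat jumped over the lazy dog in Sarnia"
# shortened to its initials plus a digit, then strengthened further.
print(check_password("Cjotldis1"))       # fails the length and symbol rules
print(check_password("Cjotld!s1_2015"))  # passes all rules
```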
Author: Malik Datardina, CPA, CA, CISA. Malik works as a GRC Strategist at Auvenir, which is working to transform the engagement experience for accounting firms and their clients. The opinions expressed here do not necessarily represent those of UWCISA, UW, Auvenir (or its affiliates), CPA Canada or anyone else.


Friday, January 23, 2015

Windows 10: Microsoft Strategic Plays hidden in its free OS upgrade!

In a previous post, we covered the integration of Cortana into the upcoming release of Windows 10. Well, the excitement continues - Joe Belfiore walks us through a number of features:


This includes:

  • Continuum: Not only is the start menu back, but it adjusts between desktop mode and (touch) tablet mode. 
  • Cortana: He confirms what we saw last time, and he couldn't resist taking a dig at Siri. To be fair, Cortana has more of the "virtual digital assistant" features, incorporating artificial intelligence and machine learning to, as he shows, book appointments and set reminders. 
  • Built-in Apps: Microsoft is offering calendar, photo, maps, video, mail and xbox apps. 

Although these key features are exciting, the bigger deal is how Microsoft is working to recapture market share from its competitors. 
  • Free upgrade if you have Win 7, 8, or 8.1! Yes, that's right: for a year, people can upgrade to Windows 10 for free! This is obviously good news for consumers. However, it appears that businesses can also upgrade, which could be the real benefit: Microsoft is effectively facilitating the move to the next version, thereby reducing the risk that companies will stick with one OS for a decade (i.e. like they did with XP). The free upgrade will also go a long way toward building bridges with customers who were unhappy with the start menu disappearing in Win 8. 
  • Windows Phone and Xbox integration: The apps mentioned work across devices. Although it is not entirely clear, it appears that to get the most out of the Windows 10 features, you need to get a Windows 10 phone. Although this seems like a long shot, it shows that Microsoft is not giving up on the mobile phone space anytime soon. Xbox integration enables Microsoft to further compete for the living room, going up against the likes of Roku, Apple TV and Google Chromecast. 
  • Bringing social to the web browser: Having lost market share to Google Chrome over the past few years, Microsoft appears to be striking back with Project Spartan. The browser offers enhanced usability features (tabs, reading, etc.), but it also has a productivity play where users can annotate websites and then share their annotations via social apps.
The new Windows 10 looks pretty amazing and now that it's free I really can't wait to try it! 

Wednesday, January 21, 2015

Amazon to Canadian Customers: These features are for US Only

Over the break, I decided to abandon my Samsung S4 and move to the Note 4. I made the mistake of buying the first cell at an independent cell shop thinking that I would get ahead on contracts. But it had chronic issues with getting the IMEI to work and then the speaker didn't work either. Never had these problems before when buying on contract. So back on contract I go.

Device is beautiful:


I had the first iteration of the device, the Note 1. This is much better, and the pen seems to be much improved in terms of being able to write by hand. Much crisper screen as well. Anyways, I digress.

Given the decent size of the device, I figured this would be the perfect opportunity to try out the immersion reading feature. Immersion reading enables you to hear and read the book at the same time, giving you the benefits of both audio learning and visualizing the text.

Much to my dismay, after hours of trying to make this work, I figured out that it doesn't. Why? Because I am in Canada.

However, it was only after confirming my suspicions with this post that I realized the truth: Canadians are locked out of yet another one of Amazon's services. The real issue is why Amazon doesn't openly say this on the amazon.ca website. In fact, I even corresponded with their help desk and assumed that they would warn me that this feature does not work. As you can read in the posts, I was not the only one frustrated by this state of things.

The incident reveals a couple of things. Firstly, Amazon is a US-focused company; it does not serve Canada well. For example, its Prime video service is available only in the US, and this is just another example. I wonder why Chapters has not exploited this.

Secondly, it exposes the weakness of the mass-service model used by Amazon as well as cloud computing companies: it can serve the masses but not the unique needs of a particular clientele.



Monday, December 29, 2014

Low Decision Agility: BigData's Insurmountable Challenge?

Having worked in the field of data analytics for over a decade, there is one recurring theme that never seems to go away: the struggle organizations have with getting their data in order.

Although this is normally framed in terms of data quality and data management, it's important to link this back to the ultimate raison d'etre for data and information: organizational decision making. Ultimately, when an organization has significant data and information management challenges, it culminates in a lack of "decision agility" for executive or operational management. I define decision agility as follows:

"Decision agility is the ability of an entity to provide relevant information 
to a decision maker in a timely manner." 

Prior to getting into the field, you would think that, with all the hype of the Information Age, it would be as easy as pressing a button for a company to get you the data you need to perform your analysis. However, once in the field, you soon realize how wrong this thinking is: most organizations have low decision agility.

I would think it is fair to say that this problem hits those involved in external (financial) audits the hardest. Because we have tight budgets, low decision agility at the clients we audit makes it cost-prohibitive to perform what is now known as audit analytics (previously known as CAATs). Our work is often reined in by the (non-IT) auditors running the audit engagement because it is "cheaper" to do the same test manually rather than parse our way through the client's data challenges.

So what does this have to do with Big Data Analytics?

As I noted in my last post, there is the issue of veracity - the final V in the 4 Vs definition of Big Data. However, veracity is part of the larger problem of low decision agility that you can find at organizations. Low decision agility emerges from a number of factors and has implications for a big data analytics initiative at an organization. These factors and implications include:

  • Wrong data: Fortune, in this article, notes the obvious issue of "obsolete, inaccurate, and missing information" in the data records themselves. Consequently, the big data analytics initiative needs to assess the veracity of the underlying data to understand how much work needs to be done to clean it up before meaningful insights can be drawn. 
  • Disconnect between business and IT: The business has one view of the data and the IT folks see the data in a different way. So when you try to run a "simple" test, it takes a significant amount of time to reconcile the business's view of the data model with IT's view. To account for this problem, there needs to be an ongoing effort to sync the users' view of the data with IT's view, so that the big data analytics rely on data that lines up with the ultimate decision maker's view of the world.  
  • Spreadsheet mania: Let's face it: organizations treat IT as an expense, not as an investment. Consequently, organizations rely on spreadsheets to do some of the heavy lifting for their information processing because it is the path of least resistance. The overuse of spreadsheets can be a sign of an IT system that fails to meet the needs of the users. However, regardless of why they are used, the underlying problem is dealing with this vast array of business-managed applications, which are often fraught with errors and sit outside the controls of the production system. The control and related data issues become obvious during compliance efforts, such as SOX 404, or major transitions to new financial/data standards, such as the move to IFRS. When developing big data analytics, how do you account for the information trapped in these myriad little apps outside of IT's purview? 
  • Silo thinking: I remember the frustration of dealing with companies that lacked a centralized function with a holistic view of the data. Each department would know its portion of the processing rules, etc. but would have no idea what happened upstream or downstream. Consequently, an organization needs to create a data governance structure that understands the big picture and can identify and address the potential gaps in the data set before it is fed into the Hadoop cluster.  
  • Heterogeneous systems: Organizations with a patchwork of systems require extra effort to get the data formatted and synchronized. InfoSec specialists deal with this issue of normalization when it comes to security log analysis: the security logs extracted from different systems need to have their event IDs, codes, etc. "translated" into a common language to enable a proper analysis of events occurring across the enterprise (see the normalization sketch after this list). The point is that big data analytics must perform a similar "translation" to enable analysis of data pulled from different systems. Josh Sullivan of Booz Allen states that "...training your models can take weeks and weeks" before the system recognizes which values fed into it are actually the same. For example, it will take a while for the system to learn that female and woman mean the same thing when looking at gender data. 
  • Legacy systems: Organizations may have legacy systems that do not retain data, are hard to extract from and are difficult to import into other tools. Consequently, it can cost time and money to get the data into a usable format, and this effort will also need to be factored into the big data analytics initiative.
  • Business rules and semantics: Beyond the heterogeneity between systems, there can also be a challenge in how something is commonly defined. A simple example is currency: in an ERP that spans multiple countries, the amount reported may be in the local currency or the dollar, and metadata is required to give that amount meaning. Another issue is that different user groups define things differently. For example, a sale for the sales/marketing folks may not mean the same thing as a sale for the finance/accounting group (e.g. the sales & marketing people may not account for doubtful accounts or incentives that need to be factored in for accounting purposes). 
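As a minimal sketch of the log "translation" described in the heterogeneous-systems point above: the mappings and sample records below are invented for illustration (only the Windows 4624/4625 logon event IDs are real), and a production pipeline would of course be far more involved.

```python
# Minimal sketch of normalizing event codes from heterogeneous systems into
# a common vocabulary before analysis. The mappings and sample records are
# invented for illustration; only the Windows 4624/4625 event IDs are real.
from collections import Counter

EVENT_MAP = {
    ("windows", "4625"):        "login_failure",
    ("linux_auth", "FAILED"):   "login_failure",
    ("windows", "4624"):        "login_success",
    ("linux_auth", "ACCEPTED"): "login_success",
}

raw_events = [
    {"source": "windows",    "code": "4625",     "user": "jsmith"},
    {"source": "linux_auth", "code": "FAILED",   "user": "jsmith"},
    {"source": "linux_auth", "code": "ACCEPTED", "user": "akhan"},
]

# Attach a common event_type so records from different systems line up.
normalized = [
    {**event, "event_type": EVENT_MAP.get((event["source"], event["code"]), "unknown")}
    for event in raw_events
]

# Once normalized, events can be analyzed together across the enterprise,
# e.g. counting login failures per user regardless of the source system.
failures = Counter(e["user"] for e in normalized if e["event_type"] == "login_failure")
print(failures)  # Counter({'jsmith': 2})
```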
Of course, this is not an exhaustive list of issues, but it gives an idea of how the promise of analytics is obscured by the tough reality of the state of data.  

In terms of the current state of data quality, a recent blog post by Michele Goetz of Forrester noted that 70% of the executive level business professionals they interviewed spent more than 40% of their time vetting and validating data. (Forrester notes the following caveat about the data: "The number is too low to be quantitative, but it does give directional insight.")

Until organizations get to a state of high decision agility - where business users spend virtually no time vetting/validating the data - organizations may not be able to reap the full benefits of a big data analytics initiative. 



Tuesday, December 23, 2014

How would you explain BigData to a business professional? (Updated)

Most people are familiar with the 4 Vs definition of Big Data: Volume, Variety, Velocity and Veracity. (And if you are not, here is an infographic courtesy of IBM:)


I have written about Big Data in the past, specifically on its implications for financial audits (here, here, and here) as well as privacy. However, I was meeting with people recently to discuss big data, and I found that business professionals understood what it was but in a way divorced from its operational implications. This is problematic, as the potential of big data is lost if we don't understand how it has changed the underlying analytical technique.

But first we must look at the value perspective: how is big data different from the business intelligence techniques that businesses have used for decades?

From a value perspective, big data analytics and business intelligence (BI) ultimately have the same value proposition: mining the data to find trends, correlations and other patterns to identify new products and services or improve existing offerings and services.

However, what Big Data is really about is that the technological constraints that previously limited analytical technique no longer exist. What I am saying is that big data is more about how we can do analysis differently than about the actual data itself. To me, big data is a trend in analytical technique where volume, variety, and velocity are no longer an issue in performing analysis. In other words - to flip the official definition into an operational statement - the size, shape (e.g. unstructured or structured) and speed of the data are no longer an impediment to your analytical technique of choice.

And this is where you, as a TechBiz Pro, need to weigh the merits of walking them through the technological advances in the NoSQL realm. That is, how did we go from the rows & columns world of BI to the open world of Big Data? Google is a good place to start. It is a pretty good illustration of big data techniques in action: using Google, we can extract information from the giant mass of data we know as the Internet (volume), within seconds (velocity), regardless of whether it's video, image or text (variety). However, Internet companies found the existing SQL technologies inadequate for the task, so they built NoSQL technologies such as Hadoop (Yahoo), Cassandra (Facebook), and Google's BigTable/MapReduce. The details aren't really important; what matters is that these companies had to invent tools to deal with the world of big data.
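To give a feel for the shift, here is a minimal, single-machine sketch of the MapReduce pattern that underpins Hadoop and Google's MapReduce. A real cluster distributes the map and reduce steps across many nodes; the documents below are invented for illustration.

```python
# Minimal, single-machine sketch of the MapReduce pattern (the model behind
# Hadoop and Google's MapReduce). A real cluster distributes the map and
# reduce steps across many nodes; the documents here are illustrative.
from collections import defaultdict

documents = [
    "big data is more about technique than data",
    "data volume variety velocity veracity",
]

# Map: each document independently emits (word, 1) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group the emitted pairs by key (the word).
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce: aggregate the values for each key.
word_counts = {word: sum(counts) for word, counts in grouped.items()}
print(word_counts["data"])  # 3
```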

And this leads to how it is has disrupted the conventional BI thinking when it comes to analysis.

From a statistical perspective, you no longer have to sample the data and extrapolate to the larger population. You can just load up the entire population, apply your statistical modeling imagination to it and identify the correlations that are there. Chris Anderson, of Wired, noted that this is a seismic change in nothing less than the scientific method itself. In a way, what he is saying is that now that you can put your arms around all the data, you no longer really need a model. He got a lot of heat for saying this, but he penned the following to explain his point:

"The big target here isn't advertising, though. It's science. The scientific method is built around testable hypotheses. These models, for the most part, are systems visualized in the minds of scientists. The models are then tested, and experiments confirm or falsify theoretical models of how the world works. This is the way science has worked for hundreds of years.

But faced with massive data, this approach to science — hypothesize, model, test — is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on."

Science aside, the observation that Chris Anderson makes has big implications for business decision making. Advances in big data technologies enable the deployment of statistical techniques that were previously not feasible and can yield insights without having to bother with model development. Statisticians and data scientists can play with the data and find something that works through trial and error. From a financial audit perspective, this has tremendous implications - once we figure out the data extraction challenge. And that's where veracity comes in, which is the topic of a future blog post.
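As a minimal sketch of this "play with the full population" approach: the dataset below is invented for illustration, and in a real big data setting the same idea would run over far more records in a distributed engine.

```python
# Minimal sketch: with the full population loaded, scan for correlations
# directly instead of hypothesizing, sampling and extrapolating first.
# The data below is invented purely for illustration.
import pandas as pd

sales = pd.DataFrame({
    "temperature_c":  [12, 16, 18, 21, 9, 16, 23, 14],
    "is_weekend":     [0,  1,  0,  1,  0, 1,  1,  0],
    "bbq_item_sales": [40, 180, 95, 210, 25, 160, 240, 55],
})

# Pairwise correlations across every column of the (full) dataset;
# strong relationships surface without a model being specified up front.
print(sales.corr())
```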

But to close on a more practical level, companies such as Tesco are leveraging big data analytics to improve their bottom line. An example, courtesy of Paul Miller from the Cloud of Data blog/podcast site, is how Tesco extracted the following insight: “[a] 16 degree sunny Saturday in late April will cause a spike. Exactly the same figures a couple of weeks later will not, as people have had their first BBQ of the season”. In terms of overall benefits to the company, he notes “Big Data projects deliver huge returns at Tesco; improving promotions to ensure 30% fewer gaps on shelves, predicting the weather and behaviour to deliver £6million less food wastage in the summer, £50million less stock in warehouses, optimising store operations to give £30million less wastage.”