A primer on Big Data and its value
What is ‘big data’, and should you care? More than half of organizations reportedly have a strategy to leverage big data to improve their business. But what is it, and why does it matter?
Computerworld has published Strategy Guide: Big Data, sponsored by EMC, which does what I think is a very good job of explaining what big data is and why it is important, in simple and (for the most part) non-technical language. It also touches on a number of the issues organizations must face as they try to manage and protect all this data.
All of this data can be converted into useful information (through business analytics) to run the business better. It enables you to see what is happening now and, in some cases, to anticipate what is likely to happen in the future. The data comes not only from traditional sources, such as enterprise applications, but also from new sources, such as social media.
The data can also be mined for risk and assurance purposes, such as:
- Risk monitoring
- Continuous monitoring
- Continuous auditing
- Fraud detection
- Information security risk assessment
The piece that is missing from the Computerworld guide is the problem of speed. These analytics may need to run against billions of records every day, or even more frequently. How can you do that with traditional technology, when the analytics can take the better part of a day to run?
For example, I know of a California bank that wants to analyze all its ATM transactions every day, but the report takes 9 hours to run because there are 1.8 billion transactions to sift through.
There’s a retail grocery store company in Europe that wants to compare its inventory levels against sales for every item in every store, and adjust prices continuously to attract customers, optimize revenue, and avoid excess inventory. How can it do that with traditional technology?
What about the financial services institution that wants to monitor social media and other activity (within its corporate systems, for privacy reasons) to identify where there is a risk of loss of key personnel? Again, we are talking about massive volumes of data.
A relatively new and exciting possibility is ‘in-memory’ computing. The general idea is that the data to be analyzed is held in memory rather than in a traditional, disk-based data warehouse. Analysis is then much faster: a report that used to take 9 hours can run in just a few seconds. Oracle has reported that reports can run up to 50,000 times faster, and SAP (my employer) has experienced even faster responses. (Click on the links above for more information.)
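As a rough, toy-scale illustration of the in-memory idea (not any vendor's implementation), the sketch below holds a hypothetical set of "ATM transactions" entirely in an in-memory SQLite database and aggregates them with no disk I/O. The table and column names are made up for the example; real in-memory platforms work at vastly larger scale with purpose-built engines.

```python
import random
import sqlite3
import time

# Generate a toy dataset of (atm_id, withdrawal amount) pairs.
# Hypothetical data for illustration only.
random.seed(42)
transactions = [
    (random.randint(1, 500), round(random.uniform(20, 500), 2))
    for _ in range(100_000)
]

# ':memory:' keeps the entire database in RAM, so the scan below
# never touches disk -- the essence of in-memory analytics.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE atm_txn (atm_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO atm_txn VALUES (?, ?)", transactions)

start = time.perf_counter()
rows = conn.execute(
    "SELECT atm_id, COUNT(*), SUM(amount) FROM atm_txn GROUP BY atm_id"
).fetchall()
elapsed = time.perf_counter() - start

print(f"Aggregated {len(transactions):,} rows into {len(rows)} "
      f"ATM summaries in {elapsed:.3f}s")
```

At this scale the aggregation completes in a fraction of a second; the point of commercial in-memory platforms is that the same principle holds when the table contains billions of rows rather than a hundred thousand.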
Questions you might want to consider asking include:
- Does the organization have a strategy to leverage big data? What are the goals and timelines? If not, why not? Do we understand the potential?
- Are we ready to manage the explosion of data? Do we have the necessary resources and expertise?
- Have we partnered with software vendors and consultants so we can acquire the right solutions to optimize the benefit at an acceptable cost and in good time?
- Can we protect and secure the data that is stored in or passes through our network? IDC points out that 80% of all the data created in the world, including data created by individuals on social media, is on corporate networks at some point in its life.
- Do we have a reasonable level of controls to manage the risks of incomplete or unreliable data? Are the appropriate security, risk, and assurance personnel involved to guide the big data initiatives?
- Has the potential to use this for continuous monitoring/auditing been considered?
I welcome your thoughts and experiences.