A primer on Big Data and its value
What is ‘big data’ and should you care? More than half of organizations are reported to have a strategy to leverage big data to improve their business.
Computerworld has published Strategy Guide: Big Data, sponsored by EMC, which does what I think is a very good job of explaining what big data is and why it is important in simple, non-technical (for the most part) language. It also touches on a number of the issues that organizations must face as they try to manage and protect all this data.
All of this data can be converted into useful information (through business analytics) to run the business better. It enables you to see what is happening now and also to anticipate, in some cases, what is likely to happen in the future. The data comes not only from traditional sources, such as enterprise applications, but new sources, such as social media.
The data can also be mined for risk and assurance purposes, such as:
- Risk monitoring
- Continuous monitoring
- Continuous auditing
- Fraud detection
- Information security risk assessment
The piece that is missing from the Computerworld study is the problem of speed. These analytics need to run against perhaps billions of records every day (or more frequently). How can you do that with traditional technology, where the analytics will take pretty much all day to run?
For example, I know of a California bank that wants to analyze all its ATM transactions every day, but the report takes 9 hours to run because there are 1.8 billion transactions to sift through.
There’s a retail grocery store company in Europe that wants to compare its inventory levels against sales for every item in every store, and adjust prices continuously to attract customers, optimize revenue, and avoid excess inventory. How can it do that with traditional technology?
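To make the grocery example concrete, here is a minimal Python sketch. The stores, items, quantities, and the pricing rule are all invented for illustration; the real difficulty is not the logic but running this same comparison across every item in every store, continuously.

```python
# Hypothetical sketch: compare inventory to recent sales for each
# (store, item) pair and nudge prices. All data and thresholds are invented.
inventory = {("store1", "milk"): 500, ("store1", "bread"): 40,
             ("store2", "milk"): 60,  ("store2", "bread"): 300}
daily_sales = {("store1", "milk"): 50, ("store1", "bread"): 35,
               ("store2", "milk"): 55, ("store2", "bread"): 20}
price = {("store1", "milk"): 1.00, ("store1", "bread"): 2.00,
         ("store2", "milk"): 1.00, ("store2", "bread"): 2.00}

adjusted = {}
for key, stock in inventory.items():
    # how many days of sales the current stock would cover
    days_of_cover = stock / max(daily_sales.get(key, 0), 1)
    if days_of_cover > 7:        # excess inventory: discount to move stock
        adjusted[key] = round(price[key] * 0.90, 2)
    elif days_of_cover < 2:      # running low: raise the price slightly
        adjusted[key] = round(price[key] * 1.05, 2)
    else:
        adjusted[key] = price[key]
```

With four store-item pairs this runs instantly; with millions of pairs recalculated throughout the day, the data access pattern, not the arithmetic, becomes the bottleneck.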
What about the financial services institution that wants to monitor social media and other activity (within its corporate systems, for privacy reasons) to identify where there is a risk of loss of key personnel? Again, we are talking about massive volumes of data.
A relatively new and exciting possibility is ‘in-memory’ computing. The general idea is that the data to be analyzed is held in memory rather than in a traditional data warehouse, where it can be analyzed far more quickly: a report that used to take 9 hours can run in just a few seconds. Oracle has reported speed-ups of as much as 50,000 times, and SAP (my employer) has experienced even faster responses.
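The principle can be illustrated with a toy sketch in Python (the record counts, field names, and data are invented, and real in-memory platforms use columnar storage and parallelism rather than plain dictionaries): once the records are resident in memory, a daily report becomes a single fast pass over the data, with no disk I/O in the loop.

```python
from collections import defaultdict
import random

# Hypothetical sketch: summarize simulated ATM transactions held entirely
# in memory. 100,000 records stand in for the billions a bank would scan.
random.seed(42)
transactions = [
    {"atm_id": random.randrange(500), "amount": random.uniform(20, 500)}
    for _ in range(100_000)
]

totals = defaultdict(float)
counts = defaultdict(int)
for tx in transactions:              # one in-memory pass over all records
    totals[tx["atm_id"]] += tx["amount"]
    counts[tx["atm_id"]] += 1

# average transaction amount per ATM, computed without touching disk
avg_by_atm = {atm: totals[atm] / counts[atm] for atm in totals}
```

The point is not the aggregation itself, which any database can do, but that keeping the working set in memory removes the slow storage layer that stretches such reports into hours.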
Questions you might want to consider asking include:
- Does the organization have a strategy to leverage big data? What are the goals and timelines? If not, why not? Do we understand the potential?
- Are we ready to manage the explosion of data? Do we have the necessary resources and expertise?
- Have we partnered with software vendors and consultants so we can acquire the right solutions to optimize the benefit at an acceptable cost and in good time?
- Can we protect and secure the data that is stored in or passes through our network? IDC points out that 80% of all the data created in the world, including data created by individuals on social media, is on corporate networks at some point in its life.
- Do we have a reasonable level of controls to manage the risks of incomplete or unreliable data? Are the appropriate security, risk, and assurance personnel involved to guide the big data initiatives?
- Has the potential to use this for continuous monitoring/auditing been considered?
I welcome your thoughts and experiences.