Taking big advantage of big data to drive big improvements in performance
One of the radical and disruptive elements of the digital world is the explosion of data, both structured and unstructured, that can be mined and turned into valuable nuggets of information. Just to set the scene, let me quote from the Financial Times of Germany (March 3, 2012):
Every day produces some 2.5 quintillion bytes of digital data. A quintillion is a number with 18 zeros. This mass of data can be worth a lot of money when it can be stored, channeled and analyzed. There is currently big hype in the IT industry around analysis and storage tools for so-called ‘big data’. Developers of databases like SAP’s subsidiary Sybase are adapting their products to the new demands of big data analysis. Theo Ruland, Managing Director, Sybase, said that speed was vital, even when analyzing complex data. An analysis that might have taken two or three hours in the past can now be done in a second, he added. “Say we’re looking for data about green-eyed butchers in Duesseldorf, Germany. In the past we would have had to go through all the lines in a database searching for a match.” Now, he went on to explain, it is possible to look for everybody in Duesseldorf, then to filter first for the butchers and finally for the green-eyed among them – a huge leap forward in terms of speed.
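The butcher example above can be sketched in a few lines of code. This is a hypothetical illustration only (the records and field names are invented, and real column stores work on compressed columnar data, not Python lists), but it shows the difference between scanning every row against all criteria at once and narrowing the candidate set one attribute at a time:

```python
# Invented sample data for illustration; a real system would hold
# millions of such records in memory, organized by column.
records = [
    {"city": "Duesseldorf", "job": "butcher", "eyes": "green"},
    {"city": "Duesseldorf", "job": "baker",   "eyes": "blue"},
    {"city": "Berlin",      "job": "butcher", "eyes": "green"},
]

# Old style: one pass over every row, testing all conditions on each.
row_scan = [r for r in records
            if r["city"] == "Duesseldorf"
            and r["job"] == "butcher"
            and r["eyes"] == "green"]

# New style: filter progressively, so each step works on a much
# smaller set than the last (everybody in Duesseldorf, then the
# butchers, then the green-eyed among them).
in_city  = [r for r in records if r["city"] == "Duesseldorf"]
butchers = [r for r in in_city if r["job"] == "butcher"]
matches  = [r for r in butchers if r["eyes"] == "green"]

assert row_scan == matches  # both approaches find the same record
```

Both queries return the same answer; the speed-up Ruland describes comes from the second approach touching far less data at each step.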
A new report from Aberdeen, In-memory computing, lifting the burden of big data (registration required), is an independent look at the opportunity presented by ‘big data’ and the use of in-memory computing. In their summary, Aberdeen says:
Business data is growing at an average of 36% per year… and organizations of all sizes, across all countries and industries, have to face the challenge of scaling up their data infrastructure to meet this new pressure. Advances in server hardware and application design have led to a potential solution: in-memory computing. In a December 2011 study on the current state of Big Data, Aberdeen’s research showed that organizations that had adopted in-memory computing were not just able to analyze larger amounts of data in less time than their competitors – they were literally orders of magnitude faster.
As Aberdeen points out, business managers simply can’t get information fast enough – and that is the major driver for investment in the new technology.
As I have written before, business is getting faster and faster. Decisions have to be made quickly and the availability of current, reliable information will enable more informed and therefore better decisions. Better decisions lead to enhanced performance.
It’s not only that information can be obtained faster (Aberdeen says over 100 times faster – although SAP customers report results over 100,000 times as fast). A report will very often prompt further analysis, with a second and third report needed to answer the questions it raises. Reports that used to take many hours now take seconds, and decisions can be made at speed.
This power can be used as a form of continuous monitoring to identify:
– Trends and anomalies in operational data requiring action
– Changes in risk levels
– Indicators of potential fraud
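The continuous-monitoring idea above can be sketched as a simple anomaly check. This is a minimal, hypothetical example (the function name, window size, threshold and payment figures are all invented for illustration): flag any transaction that deviates sharply from the recent running average, as a stand-in for the trend- and fraud-detection an in-memory system would run continuously over live operational data.

```python
from statistics import mean

def flag_anomalies(amounts, window=5, threshold=3.0):
    """Return indices of values more than `threshold` times the mean
    of the preceding `window` values (all parameters illustrative)."""
    flagged = []
    for i in range(window, len(amounts)):
        baseline = mean(amounts[i - window:i])
        if baseline and amounts[i] > threshold * baseline:
            flagged.append(i)
    return flagged

# Invented stream of payment amounts; the 5000 is the anomaly.
payments = [100, 110, 95, 105, 100, 98, 5000, 102]
print(flag_anomalies(payments))  # → [6]
```

In practice the point of in-memory computing is that a check like this can run against the full transaction stream as it happens, rather than in an overnight batch.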
One use I can see is changing the controls relied upon to manage risk. With in-memory technology, detective controls can be implemented that operate in near real-time. The ability to intervene before loss or error occurs means that it may be easier and less expensive to rely on detective controls than on costlier preventive controls.
The possibilities are huge, and I can understand why so many organizations are making major investments in the new technology. Just think of the competitive advantage when you can analyze 2 billion records in 10 seconds (as a US bank has done using SAP’s HANA technology) while your competitor has to wait a day or two.
Time is money.