Managing Big Data

Kenneth Gabriel, Global Leader-ERP Advisory, KPMG

In terms of nimbly responding to market forces, satisfying customers, and reducing risk and cost, real-time capabilities afford an organization the laser-like focus and flexibility that executives once only dreamed of. The technology industry has only recently evolved to a point where this is a reality for every company.

Along with enormous opportunity, however, real-time capability challenges organizations to transform the way they manage the oceans of data, now referred to as “Big Data,” that fuel it. It’s a challenge that dominates strategic planning for both the C-suite and IT, because such a broad variety of factors are at play. In-memory computing platforms such as SAP HANA may remove the barriers that currently separate operational and financial forecasting, as discussed below.


Data volume

The sheer amount of Big Data pertinent to any given enterprise is growing exponentially. It includes data the organization collects from internal sources such as transactions, operations, and supply chains, plus equally important data that exists outside the company, such as economic trends, market forces, and customer desires expressed through social media.

Time frame

The time frame needed to access data has been frustratingly varied. For many companies, real-time data access is still “a distant pipe dream,” as one survey of data managers and professionals put it.

The promise of Big Data, sorting through the hype

Traditional forecasting applies driver-based modeling to historical data. Big Data, by contrast, enables predictive analytics, which weighs multiple interrelated variables to determine how to optimize the business. This need for more complex and agile analysis is where IT and executive leadership strategies converge.
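The contrast can be sketched in a few lines of code. This is purely illustrative: the function names, weights, and signal values below are hypothetical, not drawn from the article or from any real forecasting product.

```python
# Illustrative contrast: a single-driver forecast versus a model that
# combines several interrelated signals at once. All numbers are hypothetical.

def driver_based_forecast(prior_revenue, growth_driver):
    """Traditional approach: project forward from one historical driver."""
    return prior_revenue * (1 + growth_driver)

def predictive_forecast(prior_revenue, weights, signals):
    """Predictive approach: blend multiple interrelated signals
    (e.g. economic trend, social-media sentiment, supply-chain lead time)."""
    adjustment = sum(w * s for w, s in zip(weights, signals))
    return prior_revenue * (1 + adjustment)

baseline = driver_based_forecast(100.0, 0.05)        # one driver
combined = predictive_forecast(100.0,
                               [0.4, 0.3, 0.3],      # hypothetical weights
                               [0.06, 0.02, -0.01])  # hypothetical signals
print(round(baseline, 2), round(combined, 2))        # 105.0 102.7
```

The point is not the arithmetic but the shape of the question: the predictive model can absorb new, external signals as they arrive, which is exactly what fast access to Big Data makes practical.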

Information and Technology Limitations

While businesses recognize the crucial importance of Big Data analytics, they typically use only about 10 percent of their data for analysis and decision-making. Until recently, the problem has been speed and volume. “The greater the data volume and the faster that data streams into the enterprise, the longer it takes for traditional analytics and data management software to turn this data into actionable information,” industry analysts explain.

Advances in technology

A new technology solution is helping resolve these issues and open up the analytics landscape in potentially ground-breaking ways. In-memory computing is an architecture that brings data closer to the computer’s processors, improving speed by dizzying proportions. SAP, for example, offers in-memory capability in SAP HANA: clients have reported that SAP HANA reduced report run times by a factor of up to 3,600, and during testing, queries against 460 billion rows of data ran in less than one second. This transforms analytics from merely looking at what happened to predicting not only what will happen, but the best that could occur.
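A toy sketch can make the architectural idea concrete. This is our own minimal illustration of why an in-memory, column-oriented layout speeds up analytics, not a depiction of SAP HANA’s internal design; the table and figures are invented.

```python
# Toy illustration: when data lives in memory as columns, an aggregate
# query touches only the column it needs, with no disk round-trips.

rows = [  # a tiny "transaction table" held entirely in memory
    {"region": "EMEA", "amount": 120.0},
    {"region": "APAC", "amount": 75.0},
    {"region": "EMEA", "amount": 60.0},
]

# Column-oriented layout: one contiguous list per column.
amounts = [r["amount"] for r in rows]
regions = [r["region"] for r in rows]

# "Query": total sales for EMEA, scanning memory only.
total_emea = sum(a for a, g in zip(amounts, regions) if g == "EMEA")
print(total_emea)  # 180.0
```

At enterprise scale the same principle applies: keeping columns resident in RAM replaces millions of slow disk reads with direct memory scans, which is where the reported speedups come from.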

In areas such as planning, forecasting, and price optimization, the flow of transactional information accelerates, improving business performance and opening up new opportunities. SAP HANA may also remove the constraints on analyzing complex, very large data volumes; analyses may now include unstructured and critical third-party data sources such as websites, e-mail, customer management systems, social media, and sensors. This can help break the current barriers between operational and financial forecasting, because information of varying scale and structure can now reside in a single data set.

In-memory computing can also reduce costs in a number of ways. Transactional and analytical data are stored as one, which lowers costs by eliminating separate data layers, disk space requirements and the indexes typically required for analytical query optimization. In addition, since fewer pieces of hardware are required to store and access data, the costs of running the equipment and cooling the server rooms are reduced. Moreover, because the SAP HANA architecture is scalable, data-storage needs may increase more slowly in comparison to disk-based computing systems.
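The cost argument, storing transactional and analytical data as one, can also be sketched briefly. The example below is a hypothetical simplification of that idea, not a description of any product’s data model.

```python
# Hypothetical sketch: one in-memory table answering both a transactional
# lookup and an analytical aggregate, with no separate analytics copy or
# index to maintain. All records are invented.

orders = {  # keyed by order id; the only copy of the data
    1: {"customer": "Acme", "total": 250.0},
    2: {"customer": "Globex", "total": 90.0},
    3: {"customer": "Acme", "total": 410.0},
}

# Transactional access: fetch one order by key.
order_2 = orders[2]

# Analytical access against the same structure: revenue per customer.
revenue = {}
for o in orders.values():
    revenue[o["customer"]] = revenue.get(o["customer"], 0.0) + o["total"]

print(order_2["total"], revenue["Acme"])  # 90.0 660.0
```

Because both workloads read the same in-memory structure, there is no duplicated data layer to store, index, or keep synchronized, which is the source of the savings described above.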

Regardless of the size of your organization, Big Data and real-time analytics will shape your future. In-memory computing can provide an elegantly simple, cost-effective solution for fulfilling Big Data’s promise.
