Better Performance through Insight: How Technology Can Get You from 80/20 to 20/80

Nick Fischer, Ex-Senior Vice President & CFO, Betteridge

Several key ingredients contribute to your success in implementing a performance management infrastructure and achieving sustained, profitable growth. First, establishing measurable performance goals and priority objectives with clear milestones helps you define what success looks like. Second, creating regular forums to review actual performance and progress drives accountability and fosters an environment where stakeholders can work together to solve problems, evaluate risks and discuss opportunities. Third, leveraging data and analytics to enhance decision-making results not just in better planning, but in improved responses to actual performance that’s trending below expectations. In terms of performance improvement, this third element often has the greatest potential but is frequently overlooked because, in many cases, the data is inaccessible and complex. It doesn’t have to be this way. By simplifying how you approach data and leveraging technology, almost any organization can improve performance through the power of actionable insight.

From Data to Insight

It’s helpful to simplify the analytics process into three stages: data compiling, analysis and advising. For many organizations, 80 percent of the effort is in the data-compiling stage, leaving just 20 percent for analysis and advising. Because most situations come with time and resource constraints, this 80/20 dynamic results in rushed analyses, reducing the quality of your insights and limiting your ability to produce sound recommendations. Over time, many organizations find the cost-benefit of analytics lacking and abandon the idea altogether.

To tap into the real benefits of analytics, organizations need to shift their analytics process from 80/20 to 20/80 – where only 20 percent of the effort is in the data-compiling stage, leaving 80 percent to convert analyses into quality insights and advise with actionable recommendations. This shift does not mean sacrificing the integrity of data, because the outputs can only be as good as the inputs. Rather, it means leveraging technology to improve the efficiency of data compiling, so that finance and analytics professionals can spend more of their time and talents analyzing and advising.

Technology as an Enabler

When we approached the development of our FP&A platform architecture, our primary objective was to leverage technology to minimize the time and effort required in data compiling. We approached our platform architecture in three phases: data compilation and storage, data organization, and insight extraction.

Data Compilation and Storage

The technology model for data compilation and storage continues to evolve from the traditional on-premises setup to cloud-based infrastructures that provide comprehensive capabilities and applications at scale without breaking the bank. Emerging data integration processes, such as the creation of a “data factory” that leverages popular programming languages like Python, can allow organizations to establish affordable ecosystems embedded with data-driven ETL workflows and pipelines that upload disparate data stores to the cloud.
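A data-factory pipeline of this kind reduces to three composable stages: extract from a source system, transform to a common schema, and load to a destination. The sketch below is a minimal, illustrative version in plain Python with hypothetical field names; a real pipeline would load into cloud storage (for example, a blob container) rather than an in-memory list.

```python
import csv
import io

def extract(raw_csv: str) -> list:
    """Parse a raw CSV export from a source system (ERP, POS, etc.) into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list) -> list:
    """Normalize fields so disparate sources share one schema."""
    return [
        {"store": r["Store"].strip().upper(), "sales": float(r["Sales"])}
        for r in records
    ]

def load(records: list, destination: list) -> None:
    """Stand-in for an upload step; in production this writes to cloud storage."""
    destination.extend(records)

# Wire the stages into one repeatable pipeline run on sample data.
raw = "Store,Sales\n greenwich ,1200.50\n vail ,980.00\n"
warehouse = []
load(transform(extract(raw)), warehouse)
print(warehouse[0]["store"])  # GREENWICH
```

Keeping each stage a small, single-purpose function is what makes the workflow automatable and easy to schedule, whatever orchestration service actually runs it.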

When we implemented our own FP&A platform, we designed automated workflows using Azure services to ingest data feeds from ERP, CRM, POS and other event data portals that our client partners provided. With data feeds linked to cloud-based servers that a third-party provider manages and secures, we avoided upfront capital investments and were ready for the next phase.

Data Organization

As the breadth and depth of data continues to grow at a rapid pace, so does the need to process, clean and transform it for consumption by analytics and BI applications. For many organizations, the need to leverage big data also means the need for enough computational firepower when it comes time to crunch it. Having the right capabilities in place is paramount to getting insights into the hands of decision-makers quickly. Leveraging services such as data lakes, along with frameworks like Hadoop and Spark coupled with machine-learning components, enables large amounts of data from any source to be processed and prepared at the speed needed to drive action today.
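The core of this organization step is cleaning and aggregating raw records into analysis-ready shapes. A toy version of the idea, in plain Python with hypothetical fields, is shown below; at scale, a Spark DataFrame `groupBy`/`agg` performs the same roll-up in a distributed fashion.

```python
from collections import defaultdict

def aggregate_sales(rows):
    """Roll raw transaction rows up to one total per store,
    dropping records that fail a basic quality check."""
    totals = defaultdict(float)
    for row in rows:
        # Quality gate: require a store key and a numeric amount.
        if row.get("store") and isinstance(row.get("amount"), (int, float)):
            totals[row["store"]] += row["amount"]
    return dict(totals)

rows = [
    {"store": "NYC", "amount": 150.0},
    {"store": "NYC", "amount": 75.0},
    {"store": None,  "amount": 40.0},   # rejected: missing store key
    {"store": "LA",  "amount": 200.0},
]
print(aggregate_sales(rows))  # {'NYC': 225.0, 'LA': 200.0}
```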

In our FP&A platform implementation, we used these capabilities to build customized data structures that aggregate, transform and prepare disparate data sources for integration with our proprietary reporting and analytical processes.

Insight Extraction

Analytics applications are working to answer the demand for rapid insights by enhancing the UIs that sit on top of the underlying analytical packages, like R or SAS. Business Intelligence and data mining platforms such as Tableau are also following suit by adding statistical packages within their already user-friendly interfaces. As a result, business users in any functional area can leverage robust and statistically rigorous methodologies and garner more advanced insights and recommendations without having a degree in data science or statistics.

Again, in our FP&A platform implementation, the data organization process generated final data sets that allowed us to seamlessly feed information into the proper end-user platforms, including EPMs, Power BI and statistical packages. This process enabled stakeholders to work with the data in near real-time, efficiently and effectively.

Shifting from 80/20 to 20/80

When it comes to distilling large amounts of data into actionable insights, time is almost always the most binding constraint. You may want to maximize the amount of time you spend analyzing and advising, but the reality is that data compiling can be such an enormous undertaking that you run out of time.

To shift to a more desirable ratio between data compiling and data analysis, rethink how you approach data compilation and storage, data organization and insight extraction. Take advantage of emerging technology around data integration processes that can help you save capital. Find services designed to integrate with analytics consumption and use them to customize your own data structure. Leverage applications and platforms built specifically for efficient, easy access to insights.

Such a comprehensive approach will help you evolve to where the bulk of your process time can be allocated to building insights and developing strategies. Ultimately, this ratio is not just a number — it’s a way to capitalize on technology to improve performance and drive success.
