This article first appeared on the OneStream blog by Zach McKeown
Every year, data inconsistencies cost organizations time and money as they chase the truth to support financial and operational reporting. Aside from the immediate impact on revenue, poor data quality over the long term leads to risk from decision-making based on inaccurate data and a lack of confidence in eXtended Planning & Analysis (xP&A) processes. Why? Because data inconsistencies confuse people, erode trust, cause angst and lead employees to seek “answers” in the data silos they know and love.
Of course, poor financial data quality leads to more than just poor decisions and risks in financial reporting. It also causes organizations to waste resources, miss opportunities, and spend far too much time fixing data – time that could be better spent on other areas of the business. And all of these impacts translate into increased costs. In fact, according to Gartner, poor data quality costs organizations an average of $12.8 million per year. Even worse? That figure doesn’t even consider the downstream implications of poor decision-making.
With the overall exponential growth of data – especially amid the rise in xP&A – the cost of poor data quality will also grow exponentially if not addressed quickly.
Understanding Why Data Consistency Matters for Financial and Operational Reporting
Whereas data redundancy occurs when the same piece of data exists in multiple places, data inconsistency occurs when the same data is stored in different formats across various systems, leaving an organization with unreliable and meaningless information. Most organizations have traditionally created workarounds to reduce the impact of data inconsistencies, but Finance can no longer support these short-sighted solutions as it transforms itself into the core entity of xP&A – providing the financial intelligence required for corporate planning, reporting, and flexible operational planning (see Figure 1). Data consistency must therefore be maintained across the enterprise to support xP&A.
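To make the distinction concrete, here is a minimal, illustrative Python sketch – not OneStream functionality, and the system names, fields, and formats are hypothetical – showing the same invoice stored redundantly in two systems, each with its own date and currency formatting. The records only agree once both are normalized to a common representation:

```python
from datetime import datetime

# Hypothetical extracts of the same invoice from two source systems.
# The data is redundant (the same fact stored twice) AND inconsistent:
# each system uses its own date and currency format.
erp_record = {"customer": "ACME", "invoice_date": "2022-03-01", "amount": "1200.50"}
crm_record = {"customer": "ACME", "invoice_date": "03/01/2022", "amount": "$1,200.50"}

def normalize_date(value: str) -> datetime:
    """Try each known source format until one parses."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {value!r}")

def normalize_amount(value: str) -> float:
    """Strip currency symbols and thousands separators before comparing."""
    return float(value.replace("$", "").replace(",", ""))

def is_consistent(a: dict, b: dict) -> bool:
    """Two records agree only after both are mapped to one canonical form."""
    return (normalize_date(a["invoice_date"]) == normalize_date(b["invoice_date"])
            and normalize_amount(a["amount"]) == normalize_amount(b["amount"]))

print(is_consistent(erp_record, crm_record))  # True once both are normalized
```

Without that normalization step, a naive string comparison would flag the two records as different – which is exactly the kind of false discrepancy that sends planners chasing "the truth" across silos.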
Most FP&A leaders agree that having valid, accurate, and usable data for financial and operational reporting is critical. The reality, however, is that many leaders spend more time chasing data inconsistencies than fixing their root causes – a pattern that erodes trust in the data.
Here are a few real-world examples of how data inconsistencies impact xP&A processes:
- Demand Planning. Balancing customer needs for products while minimizing excessive inventory and avoiding supply chain disruptions is challenging, even in the most optimal environments. The current environment of supply chain shortages has negatively impacted growth for many companies, and data inconsistencies have only compounded the challenge for those working with imperfect data.
- Workforce Planning. Aligning talent to ensure an organization has the right people – with the right skills in the right places at the right time – requires vast amounts of financial and operational data. Planners pulling data from multiple silos across a disconnected planning environment must take extra steps to validate the same slices of data. Why? Because planners don’t fully trust the data – nor should they!
- Sales Planning. Organizations invest a tremendous amount of time and resources to define and support the process of creating territory and incentive plans that align revenue and financial goals. Unfortunately, many sales operations teams are forced to work in fragmented tools with inconsistent data, creating multiple versions of the truth and forcing teams to spend time reconciling data rather than analyzing the business and driving revenue.
Conquering Data Complexity in Financial and Operational Reporting
A fully unified corporate performance management (CPM) platform (see Figure 2) with built-in Financial Data Quality Management (FDQM) is critical for organizations to align adequate financial and operational reporting across the enterprise. An essential requirement for any CPM software platform is 100% visibility from reports to sources – all financial and operational data must be clearly visible, accessible, traceable, and consistent.
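One way to picture the "100% visibility from reports to sources" requirement is a lineage record attached to each reported figure. The sketch below is purely illustrative – the field names, system names, and record IDs are hypothetical, not OneStream's data model – but it shows the idea: every number in a report carries pointers back to the source-system rows it was derived from, so it stays traceable:

```python
# Hypothetical report cell: the reported value keeps explicit pointers
# back to the source-system records it was derived from.
report_cell = {
    "account": "Revenue", "entity": "US-East", "period": "2022-03",
    "value": 1_250_000.0,
    "sources": [
        {"system": "ERP", "table": "GL_JOURNAL", "row_id": 88231},
        {"system": "Billing", "table": "INVOICES", "row_id": 10457},
    ],
}

def drill_back(cell: dict) -> list[str]:
    """Return a human-readable trace from a reported figure to its sources."""
    return [f"{s['system']}.{s['table']}#{s['row_id']}" for s in cell["sources"]]

print(drill_back(report_cell))  # ['ERP.GL_JOURNAL#88231', 'Billing.INVOICES#10457']
```

When lineage like this is captured at load time rather than reconstructed after the fact, a discrepancy in a report can be traced to a specific source record instead of triggering a manual reconciliation exercise.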
Using Guided Workflow Drives Data Quality
An organization can achieve effective data consistency through standardized and simplified data collection and analysis processes with reports available at every step in the workflow. The workflows must be guided to provide standard, defined, and repeatable processes for maximum confidence and reliability in a business-user-driven process.
In-context guided workflows protect business users from complexity by guiding them uniquely through all data management, verification, analysis, certification, and locking processes. Here are just a few examples of how OneStream’s Guided Workflows can improve an organization’s performance:
- Intelligent analytics delivered within each workflow step
- Separate input channels for data protection and transparency
- Phased workflow processes for all data collections
- User- and process-aware design
- 100% reliable process status delivered by a unified workflow
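The phased, guided character described above can be sketched as a simple ordered state machine. This is a minimal illustration of the concept – the step names and locking rule are assumptions for the example, not OneStream's implementation – in which each step can only be completed in sequence and data becomes read-only once the final lock step finishes:

```python
from enum import Enum, auto

class Step(Enum):
    """Illustrative workflow phases; names are hypothetical."""
    IMPORT = auto()
    VALIDATE = auto()
    ANALYZE = auto()
    CERTIFY = auto()
    LOCK = auto()

ORDER = list(Step)  # Enum members iterate in definition order

class GuidedWorkflow:
    """Steps must be completed in order; skipping ahead raises an error."""
    def __init__(self):
        self.completed = []

    def complete(self, step: Step) -> None:
        expected = ORDER[len(self.completed)]
        if step is not expected:
            raise RuntimeError(f"{step.name} attempted before {expected.name}")
        self.completed.append(step)

    @property
    def locked(self) -> bool:
        """Once LOCK completes, downstream consumers treat data as final."""
        return Step.LOCK in self.completed

wf = GuidedWorkflow()
wf.complete(Step.IMPORT)
wf.complete(Step.VALIDATE)
# wf.complete(Step.CERTIFY)  # would raise: ANALYZE not yet complete
```

Enforcing the sequence in the process itself, rather than relying on users to follow a checklist, is what makes the resulting process status reliable: a figure marked certified has, by construction, already passed validation and analysis.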
Deploying a program to enable data consistency is not an easy task, but the rewards are enormous. Establishing a disciplined approach to managing data as an important enterprise asset will better position the organization to improve staff productivity and better serve customers.
With the rise of xP&A, simply “dealing with” data inconsistencies is no longer an option, particularly in today’s volatile climate with ongoing global disruptions. Finance leaders and analysts who want to move forward must once again have confidence their data is accurate without compromising ease of use.
With OneStream, data is consistent across all relevant business units leveraging our unified platform – reducing the drag that data inconsistencies impose on organizational performance. Policies and technology are thus aligned, letting financial and operational leaders finally make better decisions to maximize profitability, without the fear of being blindsided by erroneous data.
To learn more about how the unrivaled power of OneStream’s Intelligent Finance Platform can help keep your data consistent, download our Financial Data Quality Management ebook. And don’t forget to tune in for additional posts from our Rise of xP&A blog series.
Gartner, “Cost Optimization is Crucial for Modern Data Management Programs,” Ankush Jain, Guido De Simoni, Eric Thoo, Adam Ronthal, Melody Chien, Donald Feinberg, Ehtisham Zaidi, Sally Parker, Simon Walker, Malcolm Hawker, June 20, 2020.