Put Data Quality in Motion with Passive Data Governance
Updated · Aug 21, 2014
By Michael Collins, BackOffice Associates
Establishing a passive data governance strategy is one of the most important steps an organization can take to ensure sustainable data quality and relevance. By monitoring data that already exists in corporate systems and automatically identifying and remediating exceptions, passive data governance non-intrusively enforces designated data rules, protecting and sustaining data quality.
In a passive data governance initiative, staff members’ day-to-day operations and data system interactions continue as usual, guided by a common understanding of business-relevant data rules and remediation protocols. The key to reaping the benefits of passive data governance is securing stakeholder buy-in and setting up effective business processes and remediation guidelines that drive long-term objectives.
Here are some suggestions on making a passive data governance strategy work:
Secure Executive Sponsorship
Passive data governance may sound like a great idea to the data stewardship team, but its effects will be limited without executive support. To give executives a stake in the program, tie data governance outcomes directly to executives’ scorecards, performance measurements and KPIs. More visible, higher-quality data will help them assess the success of organization-wide initiatives, ranging from sales, procurement, operations and manufacturing to finance and HR.
Once sponsorship has been secured at the executive level, corporate leaders can influence and incentivize individual units to participate based on their departments’ KPIs as well as those for the entire organization. Examples of best-practice KPIs include:

- Reducing inventory to optimize efficiency
- Improving customer satisfaction by resolving duplicate records and incorrect billing or contact information
- Accelerating new product introductions to the market
- Speeding order fill rates
- Improving supplier performance, whether through more favorable payment terms that improve cash flow or by implementing electronic transactions
- Optimizing employee retention through performance management
- Increasing overall market and wallet share
Measure and Manage Data Quality
It’s commonly stated that you can’t manage what you can’t measure. While the maxim may not hold in every case, once performance is measurable it is easier to quantify improvement. The fifth of Deming’s 14 Points on Total Quality calls to “improve constantly and forever every process for planning, production and service.” This needs to start somewhere. Establishing a baseline of current data quality and associating those metrics with business performance is a good first step.
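As a concrete illustration, here is a minimal sketch of such a baseline in Python, assuming records live in a pandas DataFrame and rules are expressed as pass/fail predicates. The fields, values and rules are hypothetical examples, not a prescribed schema.

```python
import pandas as pd

# Hypothetical record set; field names and values are illustrative only.
customers = pd.DataFrame({
    "customer_id":   [101, 102, 103, 104],
    "email":         ["a@example.com", None, "c@example.com", "d@example.com"],
    "payment_terms": ["NET30", "NET30", None, "NET60"],
})

# Data rules defined as predicates, so accuracy and completeness are measurable.
rules = {
    "email_present":       lambda df: df["email"].notna(),
    "payment_terms_valid": lambda df: df["payment_terms"].isin(["NET30", "NET60"]),
}

# Baseline: the share of records passing each rule today.
baseline = {name: float(check(customers).mean()) for name, check in rules.items()}
print(baseline)  # {'email_present': 0.75, 'payment_terms_valid': 0.75}
```

A baseline like this, tied to the relevant KPI, is the starting point the rest of the program improves against.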
My prior article, “How to Build a Data Assessment Business Case,” introduces this concept. There are likely very real data attributes associated with your executives’ KPIs; the challenge is to define those data rules in such a way that accuracy and completeness of the data can be measured.
Once rules and definitions are set, it is quite simple to report against them, but the processes for enforcing those rules should be scalable and repeatable. Many master data management teams already have reports they run manually, but this is not a sustainable model. What is needed is a platform that allows data teams to catalog rules and automatically execute them on an established schedule. Results, including historical numbers, should be saved in order to demonstrate progress over time, with the expectation that performance will improve.
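To make that concrete, here is a minimal sketch of such a scheduled run, reusing the hypothetical customers and rules from the earlier sketch and appending timestamped results to SQLite so that history accumulates. In practice the function would be triggered by a scheduler such as cron rather than called directly.

```python
import sqlite3
from datetime import datetime, timezone

def run_rule_catalog(conn, records, rules):
    """Evaluate every cataloged rule and append a timestamped result row."""
    conn.execute("""CREATE TABLE IF NOT EXISTS dq_history
                    (run_at TEXT, rule TEXT, pass_rate REAL, failed_count INTEGER)""")
    run_at = datetime.now(timezone.utc).isoformat()
    for name, check in rules.items():
        passed = check(records)  # boolean Series: True where the record passes
        conn.execute("INSERT INTO dq_history VALUES (?, ?, ?, ?)",
                     (run_at, name, float(passed.mean()), int((~passed).sum())))
    conn.commit()

# Each scheduled run appends rows, so improvement can be shown over time.
conn = sqlite3.connect("dq_history.db")
run_rule_catalog(conn, customers, rules)
```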
Foster Accountability for Data Quality
Once data quality metrics are tied to the executive teams’ KPIs, people will begin to take notice and give the data team the support needed to drive accountability. However, the metrics alone are not enough. The most successful approaches employ a push strategy to deliver the failed data directly to the business owners. It’s folly to expect business owners to proactively seek out the failed data themselves.
Proactive notifications are the easiest way to deliver erroneous data. The data should be filtered so business owners receive only their own records, rather than a long report that forces each recipient to do the filtering. By measuring failed data, attaching the results to the executive performance measurements and delivering the failed data directly to the owners, organizations will make great strides in establishing true and verifiable accountability.
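Here is a minimal sketch of that push model, assuming each failed record carries an owner field and using a hypothetical send_email helper; both are illustrative, not a specific product’s API.

```python
import pandas as pd

def notify_owners(failed: pd.DataFrame, send_email) -> None:
    """Push each owner's failed records directly to them, pre-filtered."""
    for owner, rows in failed.groupby("owner"):
        body = rows.drop(columns="owner").to_string(index=False)
        send_email(to=owner, subject="Data quality exceptions", body=body)

# Hypothetical failed-record feed, e.g. produced by the scheduled rule runs.
failed = pd.DataFrame({
    "owner":  ["ap@corp.example", "sales@corp.example", "ap@corp.example"],
    "record": ["VENDOR-4471", "CUST-0092", "VENDOR-5310"],
    "rule":   ["payment_terms_valid", "email_present", "payment_terms_valid"],
})

# Stand-in delivery: print instead of sending real mail.
notify_owners(failed, send_email=lambda **kw: print(kw["to"], "receives:\n" + kw["body"]))
```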
Close the Data Quality Loop with Remediation
Remediation is the pivotal governance step that closes the loop between proactive failed-data notifications and automated corrections. Data management teams can leverage technology to automatically send failed data as an alert to the specific data owners throughout the organization. With this information at their fingertips, owners can correct the data outside the system of record, confirm that it now passes the same validation rules, and then see it posted back to the source systems.
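A minimal sketch of that loop follows, with rules expressed here as per-record predicates and a hypothetical post_to_system_of_record function standing in for the actual write-back.

```python
def post_to_system_of_record(record: dict) -> None:
    """Hypothetical write-back to the source system."""
    print("posted:", record)

def remediate(record: dict, correction: dict, rules) -> bool:
    """Correct a failed record outside the system of record, revalidate it
    against the same rules, and post it back only if it now passes."""
    candidate = {**record, **correction}  # corrected copy; original untouched
    if all(check(candidate) for check in rules.values()):
        post_to_system_of_record(candidate)
        return True
    return False  # still failing: route back to the data owner

# Per-record rules (predicates over a single dict rather than a whole table).
record_rules = {"email_present": lambda r: bool(r.get("email"))}

ok = remediate({"customer_id": 102, "email": None},
               {"email": "b@example.com"}, record_rules)
print(ok)  # True: the correction passes validation and is posted back
```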
This level of automation — especially when executed with large volumes of data (think hundreds or thousands of records) — significantly accelerates the data quality cycle and allows staff to focus on strategic and growth initiatives rather than on manually correcting data day after day. It also helps ensure that data elements crossing different functions (e.g., finished products, raw materials, vendors, customers) remain cleansed and up-to-date.
The key to successful remediation is developing effective processes to correct and approve data based on global rules that can be applied uniformly yet tweaked for the nuances of individual business units or departments. Data management teams can then measure and benchmark performance and data quality over time.
Wrapping Up
Passive data governance plays an important role in proactively maintaining quality data across an organization. By enforcing business data rules through automation — and continuing to enforce them even as rules evolve over time — data management leaders can largely keep data errors at bay by establishing accountability over data and its accuracy.
There’s more than one path to success. A proven model for data managers is to tie data performance to executives’ benchmarks, set measurable metrics, assign accountable data owners and follow through with an effective remediation strategy.
Michael Collins is a global vice president at BackOffice Associates, a worldwide leader in information governance and data migration solutions, focusing on helping customers manage one of their most critical assets – data.