
Data Quality Begins with Culture

John Muehling
April 12, 2024

In recent years, data has ascended to the position of “critical asset” for companies serious about success. But if the data is dirty, redundant, or missing key information, its value quickly dissipates, moving its position on the balance sheet from asset to liability. A culture of data quality represents an organizational ethos where data accuracy, integrity, and security are ingrained in the daily practices of every team member.

Without it, poor-quality data can lead to vulnerabilities, exposing the organization to data breaches, compliance failures, inaccurate reporting, and bad decisions. Daily operations become mired in inefficiencies as teams struggle to reconcile discrepancies and errors. Forecasting, market analysis, and the execution of even simple marketing initiatives, such as personalizing experiences, rely heavily on the integrity of data. When bad data infiltrates these processes, the results can steer companies away from opportunities and down a path of failure.

The solution to these risks lies less in implementing advanced data management tools and more in cultivating a culture of data quality by establishing standards and fostering education. Businesses that successfully cultivate a data quality culture position themselves ahead of the competition in several key ways. They enjoy enhanced decision-making capabilities, as decisions are based on reliable and accurate data. They achieve greater operational efficiency, as high-quality data streamlines processes and reduces the need for rework. Plus, they gain a competitive edge in customer satisfaction and loyalty, as insights drawn from quality data enable personalized experiences and services.

This culture elevates data from static assets to dynamic forces that drive decision-making, innovation, and customer satisfaction.

Here are the three steps we recommend for creating a lasting data quality culture. 

Step 1: Everyone Is a Data Steward

Creating a data quality culture begins with a fundamental shift in perspective: recognizing that everyone in the organization, irrespective of their role, is a data steward.


This universal stewardship is crucial because data touches every aspect of the business. Whether directly interacting with data systems or indirectly contributing to the data lifecycle, each employee's engagement with data impacts its overall quality. As stewards, it is their collective responsibility to identify and address data issues before they escalate. Educating the entire company about the importance of data quality—and their part in maintaining it—is crucial.

At every level, the responsibility for data quality manifests in different ways. An intern, for instance, might not possess the technical skills or authority to rectify data errors directly. However, their ability to observe anomalies and escalate concerns is invaluable. Even the CEO plays a pivotal role in championing data quality. While it may be impractical for top executives to amend data entries personally, their actions and directives can significantly influence the organization's data culture. By highlighting inaccuracies and advocating for their correction, leaders set a precedent for the importance of data quality.

For a culture of data quality to truly take root, it should be led by action.

If leadership overlooks or accepts reports derived from poor-quality data without demanding accountability, they are endorsing substandard data practices. It is important that every company member rallies around data and treats it as the company’s most valuable asset, because it is! We all know bad data can produce bad decisions. This is especially important now, with AI coming onto the scene. If an AI model is trained with bad data, it will likely become an accelerator of bad decisions, something no organization can afford.

High-quality data is the key to staying relevant and implementing new cutting-edge technology.

By setting clear criteria for high-quality data, businesses can assess their current data state and score it against these predefined standards. This baseline serves as the foundation for continuous improvement, enabling organizations to track their progress through regular reporting. This progress should be shared with your teams. That way, all employees can work as a team to increase the quality of the company's data.
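As a rough illustration of scoring data against predefined standards, the sketch below computes the fraction of field checks that pass across a set of records. The field names and validation rules are assumptions made for the example, not a reference to any specific product or standard.

```python
# Hypothetical baseline scoring: check each record against predefined
# quality rules and report the fraction of checks that pass.

def quality_score(records, rules):
    """Return the share of (record, rule) checks that pass, from 0.0 to 1.0."""
    checks = passed = 0
    for record in records:
        for field, is_valid in rules.items():
            checks += 1
            if is_valid(record.get(field)):
                passed += 1
    return passed / checks if checks else 1.0

# Illustrative rules: completeness plus basic validity for two fields.
rules = {
    "email": lambda v: bool(v) and "@" in v,
    "country": lambda v: v in {"United States", "Canada", "Mexico"},
}

records = [
    {"email": "a@example.com", "country": "United States"},
    {"email": "", "country": "USAA"},  # fails both checks
]

print(f"Baseline quality score: {quality_score(records, rules):.0%}")
```

Re-running the same score after each cleanup cycle gives teams the regular progress report described above.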

 

Step 2: Classify & Prioritize Data

Do you know what data impacts your business the most? The more important question is, are you sure that data is clean and highly reliable?

Understanding the hierarchy of data's importance enables businesses to allocate their resources and attention during stewardship, ensuring that the most crucial datasets are pristine and reliable. Classifying data into tiers is a great way to get started. Those tiers can be labeled in various ways, but for our purposes, we’ll use “mission-critical,” “standard,” and “low-priority.”
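In code, this tiering can be as simple as tagging each dataset and ordering cleanup work by tier. The dataset names below are invented for illustration; only the three tier labels come from the text.

```python
# Illustrative sketch: order cleanup work so mission-critical data comes first.
TIER_ORDER = {"mission-critical": 0, "standard": 1, "low-priority": 2}

# Hypothetical datasets tagged with their tier.
datasets = [
    ("marketing_events", "low-priority"),
    ("customer_master", "mission-critical"),
    ("product_catalog", "standard"),
]

# Work queue sorted by tier: the most impactful data gets fixed first.
work_queue = sorted(datasets, key=lambda d: TIER_ORDER[d[1]])
for name, tier in work_queue:
    print(f"{tier:>16}: {name}")
```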

Classifying data and starting with the mission-critical tier puts companies far ahead of those trying to fix all the data simultaneously. The old adage of “eating the elephant one bite at a time” is particularly applicable when tackling a data quality project. Take these fixes in stages. While it's essential not to neglect lower-priority data, prioritizing mission-critical data ensures that the most impactful information assets are optimized for accuracy and completeness, laying a solid foundation for business intelligence and strategic initiatives.

A common challenge encountered by many organizations is managing free-form fields, such as the country name, in customer databases. Without standardization, these fields become breeding grounds for "dirty data," plagued by misspellings, inconsistent formatting, and erroneous entries. A practical solution to this issue is the implementation of controlled input methods, such as dropdown menus for country fields. This approach eliminates the variability introduced by manual entry and ensures consistency across the dataset. Dropdown menus can be populated based on comprehensive, standardized lists that reflect the requirements of the organization's systems and processes.
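The programmatic equivalent of a dropdown is validating input against a controlled list before it ever enters the database. This is a minimal sketch; the allowed values are an assumed subset, and a real list would come from the full standardized catalog the text describes.

```python
# Hedged sketch: reject free-form country entries that are not on the
# controlled list, mimicking what a dropdown menu enforces in the UI.
ALLOWED_COUNTRIES = {"United States", "Canada", "Mexico"}  # assumed subset

def validate_country(value: str) -> str:
    """Accept only entries from the controlled list; reject everything else."""
    cleaned = value.strip()
    if cleaned not in ALLOWED_COUNTRIES:
        raise ValueError(f"Unrecognized country: {value!r}")
    return cleaned

print(validate_country("  United States  "))   # normalized and accepted
# validate_country("Untied States")            # would raise ValueError
```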

But you can't guess the correct input for these lists. You have to examine your organization's data ecosystem. This includes understanding the specific needs of various systems that utilize the data—for instance, whether a system requires an ISO country code or the country's full name. Making these decisions requires a deep understanding of both the technical requirements of the organization's software and the operational context in which the data is used.

Data unification is a very important step in the process.

Using a solution to unify your data enables more efficient operations, especially when extensive data transformations are needed to meet the needs of a complex ecosystem. Let’s extend the use case we started above using geographic data.

Imagine that for 98% of the platforms that use the Country value, the full name of the country is required (“United States,” for example). But for the remaining 2%, the data needs to be transformed to ISO 3166’s Alpha-3 code instead (“USA”) and, in one case, ISO 3166’s Numeric code (“840”). Without the data unified in a single repository and the ability to transform it for the applicable downstream system, this becomes a particularly daunting task. Now imagine doing this at scale with 15 different downstream platforms. This should give you a sense of how important data unification and quality are to an organization.
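One way to sketch this pattern: keep a single canonical record and declare, per downstream system, which representation it needs. The system names are hypothetical, and a real mapping table would cover the full ISO 3166 list rather than the two entries shown here.

```python
# Illustrative sketch: one unified country record, transformed per
# downstream system. ISO 3166 codes shown are real; everything else
# (system names, table scope) is an assumption for the example.
COUNTRY_FORMS = {
    "United States": {"full": "United States", "alpha3": "USA", "numeric": "840"},
    "Canada":        {"full": "Canada",        "alpha3": "CAN", "numeric": "124"},
}

# Each downstream platform declares which representation it needs.
SYSTEM_FORMAT = {"crm": "full", "billing": "alpha3", "customs_feed": "numeric"}

def country_for_system(country: str, system: str) -> str:
    """Transform the unified country value into the form a given system expects."""
    return COUNTRY_FORMS[country][SYSTEM_FORMAT[system]]

print(country_for_system("United States", "billing"))       # "USA"
print(country_for_system("United States", "customs_feed"))  # "840"
```

Adding a sixteenth downstream platform then means adding one entry to `SYSTEM_FORMAT`, not re-cleaning the source data.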

Retailers run into this problem all the time. They might need to source products from two distinct providers—such as eggs from farms in the Midwest versus California—which may be cataloged under varying names or descriptions. The key to overcoming this challenge is carefully planning and setting data standards. Retailers must make strategic decisions based on these standards. Should they adopt the terminology of one primary source or tailor the data to fit their vernacular? The choice here will significantly impact how products are received, inventoried, and finally distributed to their locations.

Planning this out doesn’t require years, but it does require focused strategic thinking to determine the most efficient way to categorize and prioritize data.

Step 3: Enforce Data Culture Standards

You can’t suspend business operations to implement a data quality project. This is why you must build robust systems and frameworks that ensure data remains clean and reliable after initial quality improvements are implemented.

This is an ongoing process. It involves an ecosystem of tools, people, and platforms dedicated to the observability and management of data quality. These technologies are the bedrock upon which a sustainable data quality culture is built, enabling organizations to detect and address issues proactively. But they need humans behind the strategy. We need to clearly define who holds the authority to alter these standards. This won’t be the intern or a middle manager. These should be data quality experts with the knowledge and experience to establish comprehensive standards and ensure consistent application across all data sets.

If they make a change, they will know where their data might be affected and can correct it before things break. Data management tools give businesses the ability to monitor their data in real time, so errors are caught and corrected as they occur rather than after the damage is done.

Enforcing data quality through permission sets is a strategic method that ensures only authorized personnel can access, modify, or manage data, thereby safeguarding its integrity. This approach hinges on the principle of least privilege, which stipulates that individuals are granted only those permissions essential for their role-specific tasks. By meticulously defining and implementing these permission sets, organizations can effectively control the flow of data, minimize the risk of unauthorized access or erroneous modifications, and maintain a high standard of data quality. Moreover, permission sets allow for a granular level of control and auditability, enabling data stewards to track changes, identify potential discrepancies, and take corrective action promptly.
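A minimal sketch of least-privilege enforcement, assuming three illustrative roles: each role carries only the permissions it needs, and a write is refused unless the role explicitly holds the "modify" permission.

```python
# Hypothetical permission sets under least privilege. Role names and
# permission strings are assumptions for illustration.
PERMISSION_SETS = {
    "data_steward": {"read", "modify", "approve_schema_change"},
    "analyst": {"read"},
    "intern": {"read"},
}

def authorize(role: str, action: str) -> bool:
    """Grant only the permissions explicitly assigned to the role."""
    return action in PERMISSION_SETS.get(role, set())

def modify_record(role: str, record: dict, field: str, value) -> dict:
    """Apply a change only if the role is authorized to modify data."""
    if not authorize(role, "modify"):
        raise PermissionError(f"{role} may not modify data")
    record[field] = value
    return record

print(authorize("analyst", "modify"))  # False: analysts read, stewards write
print(modify_record("data_steward", {"country": "USAA"}, "country", "United States"))
```

In the article's terms: the intern can still read the data and escalate an anomaly, but only the full-time steward holds the keys to change it.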

In other words, if bad data is knocking at the front door, data stewards who manage data as their full-time job can refuse to let it in and dirty the system.

You Don’t Have to Go It Alone

We've repeatedly witnessed companies deferring the essential task of data management, opting instead to "kick the can down the road." However, with the rapid growth of AI technology, this can’t be ignored any longer. The complexities of today's data-driven business environment and the unforgiving pace of technological advancement mean we can no longer afford the luxury of procrastination. Those learning this lesson the hard way find themselves grappling with inefficiencies, missed opportunities, and, in some cases, significant reputational damage.

Luckily, you don’t have to create a data quality culture alone.

Data scientists and quality experts play a pivotal role in guiding organizations through the intricacies of modern data management. Through targeted training and workshops, they can equip teams with the skills and understanding necessary to uphold data standards long after the consultants have departed. 

This knowledge transfer ensures that maintaining data quality becomes a sustainable, ingrained part of the organizational culture rather than a temporary fix applied in response to immediate challenges. The goal is not just to remediate data quality issues as they arise but to prevent them proactively through education, empowerment, and the establishment of robust data governance practices.

Ready to get started? Contact us here