Time takes its toll on everything, and enterprise data is no exception. As databases expand and multiply, a growing number of organizations are facing the prospect of data decay.
"Data decay is any data that's not useful," states Kathy Rudy, chief data and analytics officer for technology research and advisory firm ISG. "This can include not only data that's outdated, but incomplete, inaccurate or duplicative."
Data Never Sleeps
If a house isn’t properly maintained, decay can claim it within just a few years, observes Goutham Belliappa, vice president of AI engineering at IT services and consulting firm Capgemini Americas. “Data decay occurs in much the same way, when a lack of maintenance and continuous attention lead to irrelevant data sets that are no longer useful or are disorganized.”
A typical example of data decay is when a sales or prospecting contact list fails to reflect the fact that key individuals have shifted roles or moved to a different company. “Interacting with decayed lists like this can waste up to 70% of an organization’s prospecting efforts,” Belliappa says. “On the other hand, if some of that energy were diverted to contact list curation, the interaction efficiency could increase by over 300%.”
Data decay can also occur when files are improperly catalogued, particularly when the individuals responsible for retiring a vintage data group are unaware that the asset even exists, notes Robert Audet, director and data management leader at business and technology consulting firm Guidehouse. The same holds true when it’s unclear exactly who is responsible for retiring specific data assets.
Since decay is all but inevitable for many types of data, enterprises should consider deploying management and mastering strategies that are designed to keep pace with the fluid nature of enterprise databases. “Data entropy results in over 70% of B2B data decaying per year,” Belliappa observes. “For example, if B2B contacts are not managed for one year, less than one-third of the contacts will be relevant.”
The cost of bad data is incalculable, whether it's outdated or incomplete entries driving decisions or time wasted trying to piece together data that's full of holes and mistakes. Ensuring that data is right the first time saves an organization the inconvenience and cost of having to go back and fill in the gaps, Rudy says.
Since data decay can lurk invisibly, owners are frequently unaware of the exact types of data they're actually storing, as well as which data access and protection controls are in place (if any). In such an environment, ignorance definitely isn't bliss, particularly when it comes to data protection. "Data decay represents a massive cybersecurity risk and is likely the next frontier for potential data breaches and incidents," Audet warns.
Management and Control
To minimize data decay's negative impact, Audet advises organizations to establish a data governance framework that details the exact steps necessary to detect and correct or eliminate outdated or defective data. "Records retention schedules that are regularly refreshed ensure that roles and responsibilities related to data disposal are clearly defined and understood," he notes. Audet also suggests that data decay should be addressed and remediated no less than quarterly using data cataloging tools, such as Collibra or Alation, and network monitoring tools, such as Splunk.
An organization can also detect and clean decayed data by regularly running reports that highlight data age, record completeness, and general data anomalies. Yet while technology can help point out errors, repairing faulty data remains a largely human process. “There are no silver bullets,” Rudy warns. Even advanced technologies and approaches have their limitations. “You can use AI to run queries against the data to look for mistakes and gaps, but humans have to decide what and how to fix decayed data or to remove it altogether.”
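As a rough illustration of what such a report might look like, here is a minimal Python sketch that flags stale, incomplete, or duplicated records. The contacts table, its column names (such as last_verified, name, email, company, title), and the one-year staleness threshold are all hypothetical placeholders, not a prescribed schema or tool.

```python
# Minimal sketch of a data-age and completeness report (assumed schema).
import pandas as pd

def decay_report(contacts: pd.DataFrame, max_age_days: int = 365) -> pd.DataFrame:
    """Flag records that are stale, incomplete, or duplicated."""
    report = contacts.copy()

    # Data age: days since the record was last verified or updated.
    report["age_days"] = (
        pd.Timestamp.now() - pd.to_datetime(report["last_verified"])
    ).dt.days
    report["stale"] = report["age_days"] > max_age_days

    # Record completeness: share of required fields that are populated.
    required = ["name", "email", "company", "title"]
    report["completeness"] = report[required].notna().mean(axis=1)

    # Simple anomaly check: duplicate email addresses.
    report["duplicate"] = report["email"].duplicated(keep=False)

    # Return only the records a human reviewer needs to fix or remove.
    return report[
        report["stale"] | (report["completeness"] < 1.0) | report["duplicate"]
    ]

# Example usage:
# flagged = decay_report(pd.read_csv("contacts.csv"))
# print(flagged[["name", "email", "age_days", "completeness", "duplicate"]])
```

A report like this only surfaces candidates; as Rudy notes, people still have to decide whether each flagged record should be corrected, enriched, or retired.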
Many enterprises have tried to solve their data decay problems by purchasing the latest and greatest analytical tools while overlooking or disregarding the need to keep pace with changes among their contacts, clients, and suppliers, Belliappa notes. The result can be operationally and financially devastating. “Lack of competency and a lack of focus results in a lack of relevance and, ultimately, negatively impacts customers and revenue.”
Takeaway
Decay and entropy are natural states in the universe, Belliappa observes. “Just like we spend significant energies to prevent entropy and decay in real life, companies need to spend similar energies preventing data decay and keeping their data clean to stay relevant with their customers and suppliers.”