Turning the lights on: Using data insights to improve manageability and compliance

Veritas Perspectives January 14, 2022

In the pursuit of greater organisational efficiency and market differentiation, data has become one of the most important assets an organisation holds. Yet many enterprises are hindered in mining that value by poor visibility into the data they possess. Executives seeking to derive value from their data may be familiar with the classic Peter Drucker adage, "You can't manage what you can't measure." Until their organisation has clearer insight into what data it holds, how that data is managed, and who is accountable for decisions over its ongoing retention, it will be unable to apply the correct analytics strategies to capture its value.

To explore the challenges around effective data management at a deeper level, I sat down with my colleague Peter Grimmond, Veritas' Head of Technology. Peter shared his views on how executives can address issues relating to the manageability, security, and compliance of their data practices. Together, we discussed a process executives can use to overcome common obstacles and devolve responsibilities for ongoing ownership and management. 

Shining a light on dark data 

Dark data is a recurrent problem across many large organisations. According to current estimates, as much as 90% of business data is 'dark,' meaning the organisation has little or no insight into what that data contains or where it resides on the network. Much of it may be redundant, obsolete, or trivial, yet it may still be held in high-cost storage.

It is often hard to identify the true owner of data, which means it is retained ad infinitum for fear of destroying critical business value.

Not only does this needless hoarding present a significant waste of resources, but it also exposes the organisation to security risks. Without proper identification and structuring of the data, there's a continual risk of data theft, leakage, or ransomware incidents, resulting in significant financial and reputational consequences.

Executives also need to think about the compliance implications. Regulations such as the General Data Protection Regulation (GDPR) and the UK Data Protection Act 2018 require organisations to be on the front foot of data retention and protection. It's no defence that any contravening data is a consequence of action or inaction by previous employees or administrations.

Overcoming common data management challenges

I asked Peter for his opinion on what's holding organisations back from implementing robust data management practices, hoping he could shed some light on why many organisations are struggling to establish control.

Based on his experience, Peter shared three common challenges he's observed across various industry sectors. The first relates to the complexity of modern data sources and processes, which span multiple dimensions: employees, partners, geographies, devices, and the composition of infrastructures, both on- and off-premises. Without full visibility of the data ecosystem that exists inside and around the business, there's a significant risk that the volume of dark data will grow at a much faster rate and in an ever more dispersed set of locations.

The second challenge Peter identified relates to accountability. While objective measures can be applied to whether data needs specific treatment (such as databases containing personally identifiable information), much of the organisation's data will require a subjective assessment based on whether it serves defined purposes or goals. These assessments should be devolved to informed decision-makers, meaning that responsibilities must be allocated among relevant lines of business with identified individuals empowered to dispose of unnecessary data.

Thirdly, Peter noted a challenge related to manageability. To implement a more dynamic analytics strategy, organisations need to adopt a leaner approach to data. Less is more, and unnecessary hoarding can have a negative impact on the utility of critical data. Moving petabytes of data from a physical data centre to the public cloud can be a cumbersome and pointless process if most of the data is redundant. Indeed, the costs of moving unnecessary data can offset much of the intended financial benefit of migrating to the cloud in the first place.

Improving data management practices

From our conversation, Peter and I arrived at a process that leaders can adopt to improve their organisation's approach. This process operates on the principles of simplicity, consistency and responsibility. We separated the process into six steps:

  1. Gain visibility. Use insight tools to get visibility into where your data is, how much exists, how long it's been held for, and who's using it. By structuring the data and classifying it based on specific attributes and content, it should be apparent what's superfluous to the organisation's present needs.
  2. Have the confidence to delete what's not required. Residual or obsolete data is typically a liability, so getting the organisation's stored data to a manageable level will reduce risk and help improve efficiency.
  3. Build a quality policy framework. Use established regulations and industry best practices (such as GDPR) to develop a framework suited to the organisation's unique characteristics and data sovereignty requirements. This framework should include provisions for how data will be recovered in the event of data loss, and time-bound requirements for how data will be periodically deleted to create legal defensibility.
  4. Partner effectively with business stakeholders. Data management doesn't just fall under the domain of the CIO or CTO. Leaders in operations, compliance, and corporate governance also have an important role in providing input into the design of robust data policies and processes and offering valuable support for their implementation and maintenance.
  5. Educate business data owners. Ultimately, the business and employees themselves will determine successful outcomes. Few could be excused for overlooking the importance of good data practices, given the recent prominence of landmark regulation and high-profile cases of data breaches and ransomware attacks. Use the policy framework to encourage the right behaviours and attitudes, supported by periodic education on risks and incentives.
  6. Establish an ongoing review of the policies. Be aware that data best practices will evolve. Policies need to be reviewed as regularly as is practical to make sure they continue to align with organisational needs and compliance requirements.
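Step 1 above, gaining visibility, is where dedicated insight tooling earns its keep, but the basic idea can be illustrated with a minimal sketch. The script below walks a directory tree and records each file's size and age, flagging anything untouched beyond a retention threshold as a candidate for review. This is a deliberate simplification assuming a plain filesystem estate; real classification considers content, ownership, and regulatory attributes, and the function names (`inventory`, `summarise`) are illustrative, not part of any product.

```python
import os
import time
from dataclasses import dataclass

@dataclass
class FileRecord:
    path: str
    size_bytes: int
    age_days: float
    stale: bool  # candidate for review under the retention policy

def inventory(root, stale_days=365):
    """Walk `root` and record each file's size and age.

    Files whose modification time is older than `stale_days` are
    flagged as 'stale' -- a crude, age-based proxy for dark data.
    """
    now = time.time()
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            try:
                st = os.stat(full)
            except OSError:
                continue  # skip unreadable files rather than fail the scan
            age_days = (now - st.st_mtime) / 86400
            records.append(
                FileRecord(full, st.st_size, age_days, age_days > stale_days)
            )
    return records

def summarise(records):
    """Aggregate totals so a data owner can see how much is stale."""
    total = sum(r.size_bytes for r in records)
    stale = sum(r.size_bytes for r in records if r.stale)
    return {"files": len(records), "total_bytes": total, "stale_bytes": stale}
```

Even a crude report like this gives a named data owner the two numbers the process depends on: how much data exists, and how much of it nobody has touched in over a year.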

Linking insight to value

Once the organisation has a repeatable and sustainable data management framework in place, this creates the foundation to advance strategies designed to gain business value from the data. Not all data is created equal. While some data is inherently more valuable to the business, there's also an issue of time decay that should be considered. To assess how much business value can be derived from any given set of data, there must be an owner of that data who is both informed and given license to maintain, store, or confidently delete it based on their strategic intentions.

As many organisations that have taken the hard decision to shine a light on the often alarming depths of their dark data have found, a leaner, more manageable approach predictably leads to a better quality of insight. By setting in motion a reliable process for gaining greater insight into the organisation's data, the business is better positioned to be more efficient, manage risk better, and, ultimately, make better decisions.

If you want to discuss this topic further, please get in touch. In the meantime, take a look at our insights and compliance portfolio.

Mark Nutt
SVP, International Sales