
Effective data quality governance: challenging five common myths

It is a well-established expectation that regulatory processes will become increasingly data-focused, which places a new urgency on data quality governance in the pharmaceutical industry. In this Q&A, Steve Gens and Preeya Beczek present a pragmatic view of what is involved.


Everything in the pharmaceutical industry now seems to rely ever more heavily on the effective organisation and handling of data. But what needs to happen so that users and process owners can fully trust the quality, integrity, and reliability of that data?

Steve Gens (SG): The more critical data becomes to regulatory procedures, to safety processes, to clinical research, to manufacturing, and ultimately to connecting all of those parts of the value chain more seamlessly, the greater the need for formal and consistent data governance models and data management practices to support data across all internal and external touchpoints.


When we last conducted our in-depth Regulatory Information Management (RIM) survey (of 76 life sciences companies, in 2022), the top performers expected to have most of their systems connected and sharing data within the next two to three years. Electronic trial master file (eTMF) systems, quality management systems (QMS), master data management (MDM), and enterprise resource planning (ERP) were the highest priorities for investment. Yet without high trust in the data, underpinned by strong data governance, the risks can become intolerably high as companies’ dependency on the flow of good data broadens.

So, what’s the best way forward?

Preeya Beczek (PB): It can be tempting to create a major initiative supported by a large consulting budget, driven by a lack of confidence in getting all of this right. In reality, it is more important that work starts now. Challenging some preconceptions can be very useful here.


A first myth is that data quality governance will inevitably be an overwhelming programme. But all positive change has to start somewhere. It is important to decide whether a top-down or a function-by-function (with consistent practices) approach will produce the quickest wins and the greatest overall progress. What works for one company may not suit another, especially when considering the size of the product portfolio.

SG: The second myth is that complexity and high cost are unavoidable. The ‘data-driven’ agenda might feel fresh and new in pharma, but digital process transformation is well advanced in other industries, and solid frameworks already exist and have been adapted for data quality governance in a ‘Regulatory+’ context. In other words, this need not be a steep learning curve, or leave companies with huge holes in their transformation, organisational change, or IT budgets.


PB: Much of what is needed is around nurturing the right culture, assembling the right teams or assigning key roles, communicating successes, and being on the same page as a company about the goals of this whole undertaking.

That brings us on to a third common myth: that companies are doing all of this largely because they have to. Compliance with Identification of Medicinal Products (IDMP), Substances, Products, Organisations and Referentials (SPOR), and other regulatory mandates might seem to be the most obvious driver for getting the company’s product and manufacturing process-related data in order. However, there are many higher purposes for making data-related investments. These range from more tightly-run business operations to a safer and more convenient experience for patients as consumers of existing and new products.

The tighter the controls around data quality, the more companies can do with their trusted data – use cases which could extend right out into the real world (such as prompter access to real-time updates to patient advice).

SG: Another preconception is that data quality governance is first and foremost an IT or data management concern. Yet time and again, the key success factors for a data quality programme are found to have little to do with technology, and everything to do with culture, organisation, and mindset.

Specific contributors to progress, distilled from the most promising programmes being rolled out today, include a shared data quality vision, so that good data-related practice becomes second nature. Another is establishing ‘actionable governance’ in the form of an assigned data quality officer, whose remit is to oversee efforts to clean up and maintain good data. Then there is a need to ensure that senior leaders advocate for a culture of data quality built into reward systems, and that executives drive a ‘right first time’ mindset around data as it is captured and first entered into a system.

Formal continuous improvement is important too – that is, continued rigour in raising the quality of data and making this consistent across the company over time. Underpinning all of this must be transparency of data quality performance, good communications about progress, and a plan for celebrating success as the quality and usability of data is seen to improve across the company.


PB: Companies might also assume that, because they are already fairly vigilant about data quality, they do not need a formal governance programme. This is almost never the case. Telltale signs that a company has challenges with its data quality (challenges which will only deepen as data becomes increasingly fundamental to critical everyday processes) include data quality not being viewed as an organisational competency or linked to the organisational culture; a lack of a clear data quality vision, policy, or strategy; and data connectivity being prioritised ahead of the organisational support required to properly leverage the value of that interconnected data.

What are the critical elements of a good data quality governance programme?

A strong framework is key to data quality governance

PB: With a strong framework, any company can get started on the right track, whichever way they decide to approach this (eg bottom up – function by function, or top down and enterprise wide). In fact, programmes are more successful when you have both.

The Establish/Launch Phase – an initial ground preparation phase – is about setting out a data quality vision and principles; establishing an ‘actionable’ data quality operating model with formal jobs (where needed) and defining or redefining roles and responsibilities; and conducting an awareness campaign.

The Operational Phase involves establishing optimal processes and capabilities – eg by adjusting to learnings from the Establish phase; ensuring that all roles with a bearing on data quality have these responsibilities set out in job descriptions and covered as part of the annual review process; and establishing recognisable rewards for high quality data.

Finally, in the Optimisation/Institutionalisation Phase, desirable behaviour is embedded and fostered within the organisational culture – ensuring that everyone gets, and stays, on board with maintaining and continuously improving data quality, to everyone’s benefit. Tools might include automated data quality dashboards to monitor key performance indicators (KPIs); data integration and connectivity across functions and the wider organisation; and organisation-wide data quality reporting, supporting a culture of quality.
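To make the dashboard idea concrete, the sketch below shows one way simple data quality KPIs could be calculated from a tabular extract of product master data. It is a minimal illustration only, assuming a flat-file export: the file name and field names (product_id, inn, marketing_auth_no, country) are hypothetical placeholders, not drawn from any specific RIM system or regulatory schema.

    # Minimal sketch (Python): computing basic data quality KPIs for a dashboard.
    # Assumes product master data has been exported to a flat file; the file name
    # and field names below are hypothetical placeholders, not a real RIM schema.
    import pandas as pd

    REQUIRED_FIELDS = ["product_id", "inn", "marketing_auth_no", "country"]

    def data_quality_kpis(df: pd.DataFrame) -> dict:
        """Return per-field completeness, duplicate rate, and a simple overall score."""
        completeness = {
            field: round(df[field].notna().mean() * 100, 1)  # % of non-missing values
            for field in REQUIRED_FIELDS
        }
        # Share of records that repeat an existing product identifier
        duplicate_rate = round(df.duplicated(subset=["product_id"]).mean() * 100, 1)
        average_completeness = sum(completeness.values()) / len(completeness)
        # Overall score: average completeness, discounted by the duplicate rate
        overall = round(average_completeness * (1 - duplicate_rate / 100), 1)
        return {
            "completeness_pct": completeness,
            "duplicate_rate_pct": duplicate_rate,
            "overall_score_pct": overall,
        }

    if __name__ == "__main__":
        records = pd.read_csv("product_master_extract.csv")  # hypothetical extract
        print(data_quality_kpis(records))

Figures of this kind could feed an automated dashboard or periodic data quality report, with thresholds agreed by the data quality officer and tracked as part of the continuous improvement cycle described above.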


SG: The key takeaway, though, is to get on with this as a priority. Taking a phased approach to systematic data quality governance paves the way for companies to move forward with their improvement efforts now, in bite-sized steps. As progress is witnessed, momentum should gather organically.

About the authors

Steve Gens is the managing partner of Gens & Associates, a global life science advisory and benchmarking firm specialising in strategic planning, RIM programme development, industry benchmarking, and organisational performance.

Preeya Beczek, Director of Beczek.COM, is an independent regulatory affairs expert, providing teams with valuable insights, advice and strategies for operational excellence, by optimising processes, systems, roles and operating models.