Data integrity is defined as the extent to which all data is complete, consistent, and accurate throughout the data lifecycle. It encompasses all original records and true copies, including source (raw) data, metadata, and all subsequent transformations and relationships of that data. The requirements are that data be attributable, legible, contemporaneous, original, and accurate (ALCOA).
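
To make the ALCOA attributes concrete, here is a minimal Python sketch of what a single ALCOA-aware record could look like (the class and field names are hypothetical, for illustration only):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: records should not be silently altered
class AlcoaRecord:
    """One measurement plus the metadata the ALCOA principles call for."""
    value: float              # Accurate: the measured value itself
    unit: str                 # Legible: explicit units instead of free text
    recorded_by: str          # Attributable: who (or which system) created the record
    recorded_at: datetime     # Contemporaneous: captured at the time of the activity
    source_system: str        # Original: identifies the raw/source record
    is_original: bool = True  # true copies must be flagged as such

record = AlcoaRecord(
    value=7.2,
    unit="pH",
    recorded_by="operator_jdoe",
    recorded_at=datetime.now(timezone.utc),
    source_system="bioreactor_03/pH_probe_1",
)
```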

Data integrity is not a new topic: it has been discussed for more than two decades, with FDA warning letters citing a lack of control over computer systems as early as the early 2000s. So why is it suddenly becoming such an important area of attention for regulators?

Technological progress has enabled the introduction of numerous IT systems and solutions to support production processes. On the one hand, this means that ever-larger amounts of data are being generated, which can be used to understand processes and to improve quality and efficiency. On the other hand, regulators' expectations regarding monitoring and reporting tools and procedures have also risen. This has created a virtuous cycle of higher expectations and further (needed) advances in production IT solutions for data collection, aggregation, and processing.

Consulting firm Deloitte analyzed FDA warning letters and found that most of them cite data integrity issues, and that the trend has intensified in recent years. The four most common data-related infractions were as follows (a brief sketch after the list shows how points 3 and 4 might be handled in software):

  1. Data not fully and accurately documented.
  2. Critical deviations not investigated.
  3. Lack of proper access controls.
  4. Data not recorded contemporaneously (i.e., at the time the activity took place).
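
As a rough illustration only (all names here are hypothetical; a real system would rely on a validated, database-backed audit trail), the following Python sketch addresses points 3 and 4 by checking an access-control list before every write and timestamping each entry at the moment it is created, chaining hashes so later tampering is detectable:

```python
import hashlib
import json
from datetime import datetime, timezone

# Hypothetical access-control list (infraction 3: proper access controls)
AUTHORIZED_WRITERS = {"operator_jdoe", "plc_line_4"}

# Append-only in-memory stand-in for a real audit store
audit_trail: list[dict] = []

def log_event(user: str, action: str, payload: dict) -> dict:
    """Record an event contemporaneously, refusing unauthorized writers."""
    if user not in AUTHORIZED_WRITERS:
        raise PermissionError(f"{user} is not authorized to write records")
    entry = {
        "user": user,                                         # attributable
        "action": action,
        "payload": payload,
        "timestamp": datetime.now(timezone.utc).isoformat(),  # infraction 4: logged at the time of the event
        "prev_hash": audit_trail[-1]["hash"] if audit_trail else "",
    }
    # Hash chaining makes retroactive edits detectable
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    audit_trail.append(entry)
    return entry

log_event("plc_line_4", "temperature_reading", {"value": 21.4, "unit": "degC"})
```

The hash chain is only one possible tamper-evidence mechanism; the essential point is that records are written once, at the time of the activity, by an authorized and identified user or system.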

Moreover, much of this data is isolated in production systems and machines, with no connection to the production IT systems; it is therefore dispersed and cannot be managed according to data integrity principles, and it remains inaccessible for reporting and analysis purposes. The situation is even worse than that: according to market research firm Vanson Bourne, because of this data disconnect, workers on average spend more time searching for, acquiring, entering, or moving data (8 hours a week) than making decisions based on that data (7 hours a week)!

That's why more and more life sciences companies are actively working to break down the data silos in their manufacturing operations, not only to resolve data integrity and regulatory compliance issues, but also to get the most out of the data they collect for process insights, reporting, and analytics.

The solution lies in a central production data management platform

To connect all GMP-relevant production data sources, from existing data historians to dispersed data archives on machines, life sciences companies are increasingly opting to introduce a central production data management platform. In this way, they enable proper, “data integrity friendly” data management as well as access to real-time, contextualized production data for reporting and analysis.

Such a central data management platform should provide the following functionality:

  • Connectivity to various data loggers and data-archiving databases, as well as automatic data capture from production devices/systems.
  • Central data management and contextualization.
  • Manual data entry (where automatic capture is not possible).
  • Strong user security and a comprehensive audit trail.
  • Seamless exposure of data to reporting and analysis systems (including third-party ones).

With such a platform, you can meet all ALCOA requirements and achieve full data integrity.
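
As a rough illustration of the contextualization point above (names and structures hypothetical, not tied to any specific product), the following Python sketch shows how such a platform might enrich a raw sensor reading with batch context before exposing it to reporting and analysis tools:

```python
from datetime import datetime, timezone

# Hypothetical batch context, as a central platform might maintain it
BATCH_CONTEXT = {
    "bioreactor_03": {"batch_id": "B-2024-0117", "product": "mAb-X", "step": "fermentation"},
}

def contextualize(tag: str, value: float, unit: str, equipment: str) -> dict:
    """Attach batch and equipment context to a raw reading so that
    downstream reporting/analytics tools receive self-describing records."""
    context = BATCH_CONTEXT.get(equipment, {})
    return {
        "tag": tag,
        "value": value,
        "unit": unit,
        "equipment": equipment,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        **context,  # adds batch_id, product, step when the equipment is known
    }

print(contextualize("pH", 7.2, "pH", "bioreactor_03"))
```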

Source: Most common data integrity violations in life science manufacturing and how to avoid them