Analysis of the LTDP Requirements of ALCOA

This chapter will cover:

  • Importance of ALCOA
  • Requirements for LTDP
  • ALCOA reminder

A set of requirements for a system used for long-term Data Integrity is provided below. These requirements have been derived from the ALCOA+ principles and from an analysis of the regulatory guidelines for Data Integrity. Each principle below includes its definition taken from the latest draft EMA guidelines[36]. The Data Integrity guidelines from the FDA, EMA, MHRA and WHO emphasise a range of areas for achieving ALCOA compliance, including the use of audit trails, digital signatures, maintaining Data Integrity during transfers and migrations, the creation of true copies from originals, and the handling of static and dynamic data, among other related areas. These requirements and guidelines are incorporated into the analysis below.
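The guidelines' emphasis on true copies and on verifying Data Integrity across transfers and migrations is commonly implemented with fixity checks, i.e. comparing checksums before and after a copy is made. The sketch below is purely illustrative (the guidelines do not mandate a particular algorithm; SHA-256 is an assumption here):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Compute a fixity value (SHA-256 hex digest) for a byte stream."""
    return hashlib.sha256(data).hexdigest()

def is_true_copy(original: bytes, copy: bytes) -> bool:
    """A 'true copy' must be bit-identical: the fixity values must match."""
    return sha256_hex(original) == sha256_hex(copy)

# Hypothetical record content, before and after migration to an archive.
record = b"<trial id='001'><result>4.2</result></trial>"
migrated = bytes(record)  # e.g. the bytes received by the target system
assert is_true_copy(record, migrated)
```

In practice the fixity value would be recorded as metadata alongside the record and re-checked on every transfer and periodically during retention.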

The requirements below focus on what’s needed to achieve long-term Data Integrity. They cover the archiving and long-term retention stages of the data lifecycle. Additional requirements will need to be met when the data is ‘live’.

Various systems can be considered when looking for a solution to the requirements. These include:

  • The system originally used to create, process, manage, and retain data when it was live, such as an eTMF system, a laboratory device, or other information system.
  • A centralised document or data repository within an organisation, such as a SharePoint server or ERDMS.
  • A dedicated data archiving and digital preservation system, for example deployed in house or provided as a cloud hosted service.

Given the demands of long-term Data Integrity, it is unlikely that all of these requirements will be met by leaving data in a live system or by using a centralised generic solution such as SharePoint. These systems are focussed on data creation and processing, so they may well be good homes for the data in the early stages of the data lifecycle, including as an initial home for archived data (this is recommended by the GXP guidelines in some cases, such as for dynamic data); however, in the long term it is likely that a dedicated digital preservation solution will be needed.

Below is a reminder of the ALCOA++ principles as outlined in the Data Integrity chapter.

ALCOA++ Principles and Descriptions

Attributable
Data should be attributable to the person generating the data. Based on the criticality of the data, it should also be traceable to the system/device in which the data were generated/captured. The information about the originator (e.g. system operator, data originator) and the system (e.g. device, process) should be kept as part of the metadata.

Legible, Traceable, Permanent
Data should be maintained in a readable form to allow review in its original context. Therefore, changes to data, such as compression, encryption and coding, should be completely reversible to facilitate this.

Data should be traceable throughout the data life cycle. Any changes to the data, to the context or to the metadata should be traceable, should not obscure the original information and should be explained, where necessary. Changes should be documented as part of the metadata (e.g. audit trail).

Contemporaneous
Data should be generated by a system or captured by a person at the time of the observation. The time point of observation and the time point of permanent save should be kept as part of the metadata, including the audit trail. Accurate date and time information should be automatically captured and should be linked to and set by an external standard (e.g. coordinated universal time (UTC), a central server).

Original
Data should be the original first generation/capture of the observation. Certified copies can replace original data. Information that is originally captured in a dynamic state should remain available in that state.

Accurate
The use of computerised systems should ensure that the data are at least as accurate as those recorded by paper means. The coding process, which consists of matching text or data collected on the CRF to terms in a standard dictionary, thesaurus or tables (e.g. units, scales), should be controlled. The process of data transfer between systems should be validated to ensure the data remain accurate. Data should be an accurate representation of the observations made. Metadata should contain information to describe the observations and, where appropriate, information to confirm their accuracy.

Complete
To reconstruct and fully understand an event, data should be a complete representation of the observation made and should be presented in the original context with the associated metadata, including the audit trail. Data whose original context or metadata, including the audit trail, have been lost and/or detached are not complete.

Consistent
Processes should be in place to ensure consistency of the definition, generation/capture and management (including migration) of data throughout the data life cycle. Processes should be implemented to minimise the risk of contradictions, e.g. by the use of standardisation, data validation and appropriate training.

Enduring
Data should be maintained appropriately such that they remain intact and durable through the entire data life cycle, as appropriate, according to regulatory retention requirements.

Available
Data should be stored at all times in order to be readily available for review when needed.
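The 'Legible, Traceable, Permanent' principle above requires that transformations such as compression be completely reversible. As an illustrative sketch (the use of Python's zlib here is an assumption for demonstration, not something the guidelines mandate), a fixity value recorded before compression can confirm that decompression restores the record bit for bit:

```python
import hashlib
import zlib

# Hypothetical tabular record as it was originally captured.
original = b"subject,visit,value\n001,1,4.2\n001,2,4.5\n"

# Record a fixity value (SHA-256 digest) before any transformation.
fixity = hashlib.sha256(original).hexdigest()

# Lossless compression for storage; the transformation is fully reversible.
stored = zlib.compress(original)
restored = zlib.decompress(stored)

# The restored record must reproduce the original exactly.
assert hashlib.sha256(restored).hexdigest() == fixity
```

A lossy transformation (for example, down-sampling an image) would fail this check, which is exactly the kind of irreversible change the principle rules out.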
