Document title: GLP data integrity definitions and guidance for industry
Document submitted by the member from the United Kingdom
Background
Since the introduction of the OECD Principles of Good Laboratory Practice, the way in which non-clinical health and environmental safety studies are conducted has continued to evolve, in line with the introduction and ongoing development of supporting technologies. The use of computerised systems and the generation of electronic data are now common across all aspects of a GLP study, through planning, performing, monitoring, recording and, finally, archiving. However, the main purpose of the OECD Principles of GLP, promoting the development of quality study data, remains the same: having confidence in the quality and integrity of the data generated, and being able to reconstruct studies, remains a fundamental requirement that underpins the OECD Principles of GLP.
Introduction
This document aims to provide guidance as to the GLP data integrity expectations that should be
considered in order to reflect the technological environment in which facilities involved in the conduct of
non-clinical health and environmental safety studies now operate. This guidance on data integrity is
intended to complement the existing OECD Principles of GLP along with the OECD consensus and
advisory documents.
In order to meet the requirements of operating in compliance with the Principles of GLP, the arrangements in place at a test facility with respect to people (test facility management, Study Director, study staff etc.), systems (QA, SOPs, study conduct etc.) and facilities (laboratories, archives etc.) should be designed, operated and, where appropriate, adapted to support a working environment that protects data in all its forms, i.e. paper and electronic, where the effort and resource applied to assure the validity and integrity of the data are commensurate with the risk to the GLP compliance status of the facility and the studies conducted. Taken collectively, these arrangements fulfil the concept of data governance.
It should be noted that data integrity requirements apply equally to manual (paper) and electronic data.
Test Facilities should be aware that reverting from automated / computerised to manual / paper-based
systems will not in itself remove the need for appropriate data integrity controls.
In addition to an overarching data governance system, which should include relevant policies and staff training in the importance of data integrity, consideration should be given to the organisational (e.g. procedures) and technical (e.g. computer system access, user/program options) controls applied to different areas of the quality system. The degree of effort and resource applied to the organisational and technical control of Data Lifecycle elements should be commensurate with their criticality, in terms of impact on the compliance status of the facility and the studies conducted.
Data may be generated by (i) a paper-based record of a manual observation, or (ii) equipment, ranging from simple machines through to complex, highly configurable computerised systems. The inherent risks to Data Integrity may differ depending upon the degree to which data (or the system generating or using the data) can be configured, and therefore potentially manipulated (see figure 1).
Figure 1: Diagram to illustrate the spectrum of simple machine (left) to complex computerised
system (right), and relevance of printouts as original data
With reference to figure 1, simple systems (such as pH meters and balances) may only require calibration, whereas complex systems require validation for their intended purpose (see Validation - For Intended Purpose). Validation effort increases from left to right in the diagram above. However, it is common for companies to overlook systems of apparently lower complexity. Within these systems it may be possible to manipulate data or repeat testing to achieve a desired outcome with limited opportunity for detection (e.g. stand-alone systems with a user-configurable output, such as FT-IR or UV spectrophotometers).
Excluded Data
Data generated during the conduct of a GLP study may only be excluded from the final report, or from the determination of the study outcomes, where it can be demonstrated through sound science that the data are invalid, anomalous or non-representative. All data (even if excluded from the final report) should be retained, archived, and available for review in a format which permits interaction with the data to confirm the validity of the decision to exclude them.
Systems should be designed in a way that encourages compliance with Data Integrity expectations. Examples include:
- Accessibility of clocks for recording timed events, with consideration for their synchronisation where this is necessary for the validity of the data, e.g. HPLC acquisition times
- Accessibility of study notebooks and pro formas at the locations where activities take place, so that the practice of recording on scrap pieces of paper and later transcribing to official records is not necessary
Data Integrity Definitions and Guidance Revision 1.1 March 2015 page 3
- Control over free access to blank paper templates for raw data recording
- User access rights to prevent unauthorised data amendments and, in addition, to allow visibility of legitimate amendments
- Automated data capture, or printers attached to equipment such as balances
- Proximity of printers to relevant activities
- Access to Raw Data and supporting metadata for staff performing data-checking activities
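The clock-synchronisation point in the first example above can be illustrated with a minimal sketch that flags instrument clocks drifting beyond a tolerance. This is not part of the guidance: the instrument names, reference time and five-second tolerance are all hypothetical, and a real facility would define its own tolerance and checking procedure in an SOP.

```python
from datetime import datetime, timedelta

# Hypothetical acceptance tolerance for clock drift (illustrative only).
TOLERANCE = timedelta(seconds=5)

def clock_offsets(reference: datetime, instrument_clocks: dict) -> dict:
    """Return the offset of each instrument clock from the reference time."""
    return {name: t - reference for name, t in instrument_clocks.items()}

def out_of_sync(reference: datetime, instrument_clocks: dict) -> list:
    """List instruments whose clocks deviate from the reference by more than TOLERANCE."""
    return [name
            for name, offset in clock_offsets(reference, instrument_clocks).items()
            if abs(offset) > TOLERANCE]

# Hypothetical readings taken during a daily clock check.
reference = datetime(2015, 3, 1, 9, 0, 0)
clocks = {
    "HPLC-01": datetime(2015, 3, 1, 9, 0, 2),   # 2 s fast: within tolerance
    "HPLC-02": datetime(2015, 3, 1, 9, 1, 30),  # 90 s fast: flag for correction
}
print(out_of_sync(reference, clocks))  # → ['HPLC-02']
```

Instruments flagged this way would have their clocks corrected (and the correction documented) before acquisition times on those systems are relied upon.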
The use of scribes to record an activity on behalf of another operator should be considered exceptional, and should only take place where:
- The act of recording could compromise the compliance of the activity, e.g. ophthalmology slide scoring, necropsy of large animals; or
- It accommodates cultural or staff literacy / language limitations, for instance where an activity is performed by one operator but witnessed and recorded by another, e.g. blood sampling.
In these situations, the record must be made contemporaneously with the task being performed, and must identify both the person performing the observed task and the person completing the record. The person performing the observed task should countersign the record wherever possible, although it is accepted that this countersigning step will be retrospective. The process for completing scribe documentation should be described in an approved procedure, which should also specify the activities to which the process applies.
Each term is listed below with its definition and, where relevant, expectation / guidance.
Data
Definition: Information derived or obtained from Raw Data (e.g. a reported analytical result).
Expectation / guidance: Data should be:
- A: attributable to the person generating the data
- L: legible and permanent
- C: contemporaneous
- O: original record (or Verified Copy)
- A: accurate
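As an illustration only (not part of the guidance), the ALCOA attributes above might map onto a structured electronic record roughly as follows; the `Observation` type and its field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: once written, the record is permanent (Legible/permanent)
class Observation:
    value: str             # Accurate: the value as observed, not transcribed later
    recorded_by: str       # Attributable: the person generating the data
    recorded_at: datetime  # Contemporaneous: captured at the time of the activity
    source: str = "original"  # Original record (or "verified copy")

obs = Observation(
    value="3.5 mg",
    recorded_by="J Smith",
    recorded_at=datetime(2014, 7, 1, 10, 15, tzinfo=timezone.utc),
)
print(obs.recorded_by)  # → J Smith
```

The `frozen=True` flag makes any attempt to overwrite a field raise an error, which loosely models the expectation that amendments happen as new, attributable entries rather than silent edits.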
Raw Data
Definition: All original test facility records* and documentation, or verified copies thereof, which are the result of the original observations and activities in a regulatory study.
Expectation / guidance: Raw Data must:
- Permit the full reconstruction of the activities resulting in the generation of the data
- Be legible and accessible throughout the Data Lifecycle
- Be retained in the format in which they were originally generated (i.e. paper or electronic), or as a Verified Copy
- Be contemporaneously and accurately recorded by permanent means
Where multiple means exist for the concurrent generation of data, the organisation is expected to define which method is to be used. For example, where a freezer has a visual display, a chart recorder and a fitted data logger, all of which display raw data concurrently, only one of these methods should be defined as the raw data; if a problem or anomaly were to exist with the chosen method, it would not then be acceptable to choose another method on the basis that it gives a more favourable outcome.
Metadata
Definition: Metadata are data that describe the attributes of other data, and provide context and meaning. Typically, these are data that describe the structure, data elements, inter-relationships and other characteristics of data. Metadata also permit data to be attributed to an individual (or, if automatically generated, to the original data source). See also Flat File.
Expectation / guidance: Metadata form an integral part of the original record; without metadata, the data have no context. Example: in the record "sodium chloride batch 1234, 3.5 mg. J Smith 01/07/14", the data value is "3.5", while the metadata giving it context and meaning are "sodium chloride batch 1234", "mg", "J Smith" and "01/07/14".
Data Integrity
Definition: The extent to which all data are complete, consistent and accurate throughout the Data Lifecycle.
Expectation / guidance: Data integrity arrangements must ensure that the accuracy, completeness, content and meaning of data are retained throughout the Data Lifecycle.
Data Governance
Definition: The sum total of arrangements to ensure that data, irrespective of the format in which they are generated, are recorded, processed, retained and used so as to ensure a complete, consistent and accurate record throughout the Data Lifecycle. Data governance systems should also ensure provision for timely access to data by national monitoring authorities upon request.
Expectation / guidance: Data governance should address data ownership throughout the Data Lifecycle, and consider the design, operation and monitoring of processes / systems in order to comply with Data Integrity expectations, including control over intentional and unintentional changes to information. Data governance systems should include staff training in the importance of data integrity and the creation of a working environment that encourages an open reporting culture for errors, omissions and aberrant results.
Data Lifecycle
Definition: All phases in the life of the data (including Raw Data) from initial generation and recording through processing (including analysis, transformation or migration), use, reporting, retention, archiving, retrieval, transfer and destruction.
Expectation / guidance: Archival procedures should be in place, and procedures for the destruction of data should consider data criticality and, where applicable, legislative requirements for the long-term retention of relevant data.
GLP Data Integrity Definitions and Guidance Draft version 1 January 2016 page 6
Data Transfer / Migration
Definition: The process of transferring data and metadata between storage types, formats, or computer systems.
Original Record / Verified Copy
Definition: Original record: data in the file or format in which it was originally generated, preserving the Data Integrity (accuracy, completeness, content and meaning) of the record, e.g. an original paper record of a manual observation, or an electronic raw data file from a computerised system. Verified Copy: a copy of original information that has been verified as an exact (accurate and complete) copy having all of the same attributes and information as the original. The copy may be verified by dated signature or by a validated electronic system. A true copy may be retained in a different electronic file format to the original record, if required, but must retain the equivalent static / dynamic nature of the original record. Data may be static (e.g. a fixed record such as paper or pdf) or dynamic (e.g. an electronic record which the user / reviewer can interact with).
Expectation / guidance: Original records and verified copies must preserve the integrity (accuracy, completeness, content and meaning) of the record. Exact (true) copies of original records may be retained in place of the original record (e.g. a scan of a paper record), provided that a documented system is in place to verify and record the integrity of the copy. It is possible for raw data generated by electronic means to be retained in an acceptable paper or pdf format, where it can be justified that a static record maintains the integrity of the original data, for example in HPLC analysis. However, the data that are retained must be able to support full reconstruction of the raw data, i.e. provide the metadata, the relevant audit trail and result files, and the software / system configuration settings specific to each analytical run, in order to preserve the accuracy, completeness, content and meaning. A documented means of verifying that the printed records are an accurate representation would also be required. This approach is likely to be onerous in its administration to enable a GLP-compliant record.
For example, with a dynamic chromatography record the data system application provides the ability to monitor, trend and query data, allowing the reviewer (with proper access permissions) to reprocess, view hidden fields, and expand the baseline to view the integration more clearly.
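One way a validated electronic system might verify that a copy is exact, as the Verified Copy definition above requires, is by comparing cryptographic checksums of the original and the copy. This is a minimal sketch, not a prescription from the guidance; the record content reuses the earlier metadata example.

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest of the SHA-256 checksum of the given bytes."""
    return hashlib.sha256(data).hexdigest()

def is_exact_copy(original: bytes, copy: bytes) -> bool:
    """A copy is exact (accurate and complete) when its checksum matches the original's."""
    return sha256(original) == sha256(copy)

original = b"sodium chloride batch 1234, 3.5 mg. J Smith 01/07/14"
print(is_exact_copy(original, original))         # → True
print(is_exact_copy(original, original + b" "))  # → False: even one added byte is detected
```

In practice the computed checksums, the identity of the person or system performing the verification, and the date would themselves be recorded, so that the integrity of the copy is documented as well as checked.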
Computer System Transaction
Definition: A computer system transaction is a single operation or sequence of operations performed as a single logical unit of work. The operation(s) that make up a transaction may not be saved as a permanent record on durable storage until the user commits the transaction through a deliberate act (e.g. pressing a save button), or until the system forces the saving of data. The metadata (i.e. user name, date and time) are not captured in the system Audit Trail until the user commits the transaction. In computerised systems, an electronic signature may be required by the system in order for the record to be saved and become permanent.
Expectation / guidance: Computer systems should be designed to ensure that the execution of critical operations is recorded contemporaneously by the user and is not combined into a single computer system transaction with other operations. A critical study activity such as animal dosing should be performed within appropriate parameters to ensure control over all the steps. Example of dosing steps:
- Animal identification by scanning of microchip
- Animal body weight taken
- Animal dosed
Each of the above steps takes place in order; it would not be appropriate for the system to record a single time stamp for all three dosing-related activities.
Audit Trail
Definition: Audit trails are metadata forming a record of GLP-critical information (for example the change or deletion of GLP-relevant data), which permit the reconstruction of GLP activities.
Expectation / guidance: Where computerised systems are used to capture, process, report or store Raw Data electronically, the system design should always provide for the retention of full audit trails showing all changes to the data while retaining previous and original data. It should be possible to associate all changes to data with the persons making those changes, and changes should be time stamped and a reason given. Users should not have the ability to amend or switch off the audit trail.
Audit trail review should be part of any routine data review process, usually performed by the operational area which generated the data, e.g. review of the audit trail from the processing of an HPLC analytical run, performed by a member of laboratory staff. There should be evidence available to confirm that review of the relevant audit trails has taken place at the time of study report finalisation. When designing a system for the review of audit trails, this may be limited to those with GLP relevance (e.g. relating to data creation, processing, modification and deletion). Audit trails may be reviewed as a list of relevant data, or by a validated exception-reporting process. QA should also inspect a sample of relevant audit trails, raw data and metadata as part of their QA inspection programme, to ensure ongoing compliance with the organisation's data governance policy / procedures.
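A minimal sketch of an audit-trail entry consistent with the expectations above: the previous value is retained, the change is attributed to a person, time stamped, and given a reason, and the trail exposes no user-facing way to amend or delete entries. The `AuditTrail` class and its field names are hypothetical, not from the guidance.

```python
from datetime import datetime, timezone

class AuditTrail:
    """Append-only record of changes; no amend or delete method is exposed."""

    def __init__(self):
        self._entries = []

    def record_change(self, user, field, old_value, new_value, reason):
        """Record one change: previous and new values are both retained,
        attributed to a user, time stamped, with a reason given."""
        self._entries.append({
            "user": user,
            "field": field,
            "old_value": old_value,   # original data retained alongside the change
            "new_value": new_value,
            "reason": reason,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def entries(self):
        """Return a copy for review; the trail itself cannot be edited through it."""
        return [dict(e) for e in self._entries]

trail = AuditTrail()
trail.record_change("J Smith", "peak_area", 1532.4, 1540.1,
                    "re-integration after baseline correction")
print(len(trail.entries()))  # → 1
```

Returning copies from `entries()` means a reviewer can list and filter changes (supporting list-based or exception-based review) without any route to altering the underlying trail.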
Data Review
Expectation / guidance: In order that the Study Director and any Principal Investigators can ensure that all data are fully documented and recorded, there should be a system for the review and approval of Raw Data. Data review must also include a review of relevant Metadata, including Audit Trails.
Computerised System User Access / System Administrator Roles
Expectation / guidance: It is acknowledged that some computerised systems support only a single user login or a limited number of user logins. Where no suitable alternative computerised system is available, a paper-based method of providing traceability will be permitted. The lack of suitability of alternative systems should be justified based on a review of system design, and documented. Individuals should log in using the account with the appropriate access rights for the given task; e.g. a laboratory manager performing data checking should not log in as system administrator where a more appropriate level of access exists for that task.
Data Retention
Expectation / guidance: Raw Data (or a Verified Copy thereof) generated in paper format may be retained, for example, by scanning, provided that a process is in place to verify the completeness of the copy. Secure controls must be in place to ensure the Data Integrity of the record throughout the retention period, with Validation - For Intended Purpose where appropriate.
Archive
Definition: A designated secure area or facility (e.g. cabinet, room, building or computerised system) for the long-term, permanent retention of completed data and relevant metadata in their final form, for the purposes of reconstruction of the process or activity.
Expectation / guidance: Archived records should be protected such that they cannot be altered or deleted without detection and an audit trail. The archive arrangements must be designed to permit recovery and readability of the data and metadata throughout the required retention period. In the case of electronic data archival, this process must be validated; in the case of legacy systems, the ability to review data should be periodically verified, i.e. to confirm the continued functioning of the legacy computerised systems.
Backup
Definition: A copy of current (editable) data, metadata and system configuration settings (e.g. variable settings which relate to an analytical run) maintained for the purpose of disaster recovery.
Expectation / guidance: Backup and recovery processes must be validated for their intended purpose (see Validation - For Intended Purpose).
File Structure
Expectation / guidance: File structure has a significant impact on the inherent Data Integrity risks. The ease with which flat files can be manipulated or deleted means that they require a higher level of logical and procedural control over data generation, Data Review and storage than data held within a relational database.
Flat File
Definition: A 'flat file' is an individual record which may not carry with it all relevant metadata (e.g. pdf, dat, doc).
Expectation / guidance: Flat files may carry basic metadata relating to file creation and the date of last amendment, but may not carry an audit trail of the type and sequence of amendments. When flat-file reports are created from electronic data, the metadata and audit trails relating to the generation of the Raw Data may be lost, unless these are retained as a Verified Copy. For example, a pdf printout of an individual chromatogram from an HPLC run will not show how many times the chromatogram was integrated and saved, and unless the baseline is expanded on the pdf print it may be difficult to determine whether the integration was appropriate. There is an inherently greater Data Integrity risk with flat files (e.g. when compared to data contained within a relational database), in that they are easier to manipulate and delete as a single file.
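The limitation described above can be demonstrated directly: the only metadata a flat file carries by itself is what the file system records about it, such as the time of last modification. The file name and content below are hypothetical.

```python
import os
import tempfile
from datetime import datetime, timezone

# Write a 'flat file' result; the only metadata it carries is whatever
# the file system records about the file itself.
with tempfile.NamedTemporaryFile("w", suffix=".dat", delete=False) as f:
    f.write("sample 42: 3.5 mg\n")
    path = f.name

info = os.stat(path)
print("last modified:", datetime.fromtimestamp(info.st_mtime, tz=timezone.utc))

# The file system records only the latest modification time: it does not
# say who changed the file, what was changed, how many times, or why.
# That amendment history is exactly what an audit trail would provide.
os.remove(path)
```

This is why the guidance treats flat files as inherently higher-risk: overwriting or deleting the file replaces or removes the entire record, and nothing in the file itself preserves the previous values.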
Relational Database
Definition: A relational database stores different components of associated data and metadata in different places; each individual record is created and retrieved by compiling the data and metadata for Data Review.
Expectation / guidance: This file structure is inherently more secure, as the data are held in a large file format which preserves the relationship between data and metadata. A relational database saves data and metadata in different locations, so that an individual record is compiled from the different data locations.
Access to the database should be controlled, and consistent with the requirements for computerised system user access / system administrator roles.
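A minimal sketch of the relational structure described above, using SQLite: data and metadata live in separate tables, and a reviewable record is compiled by joining them. The table and column names are hypothetical, and the values reuse the earlier metadata example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE results  (id INTEGER PRIMARY KEY, value REAL);
    CREATE TABLE metadata (result_id INTEGER REFERENCES results(id),
                           analyte TEXT, operator TEXT, recorded_at TEXT);
""")

# Data and metadata are stored in different places...
con.execute("INSERT INTO results VALUES (1, 3.5)")
con.execute("INSERT INTO metadata VALUES "
            "(1, 'sodium chloride batch 1234', 'J Smith', '2014-07-01')")

# ...and the individual record for Data Review is compiled by joining them,
# preserving the relationship between the data value and its context.
row = con.execute("""
    SELECT m.analyte, r.value, m.operator, m.recorded_at
    FROM results r JOIN metadata m ON m.result_id = r.id
""").fetchone()
print(row)  # → ('sodium chloride batch 1234', 3.5, 'J Smith', '2014-07-01')
```

Because a single result cannot simply be overwritten or removed as one file without touching the database itself, manipulation requires database-level access, which is where the user-access and administrator-role controls described above apply.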
Validation - For Intended Purpose
Expectation / guidance: Computerised systems should comply with The Application of the Principles of GLP to Computerised Systems and be validated for their intended purpose. This requires an understanding of the computerised system's function within a process. For this reason, the acceptance of vendor-supplied validation data in isolation from system configuration and intended use is not acceptable. In isolation from the intended process or end-user IT infrastructure, vendor testing is likely to be limited to functional verification only, and may not fulfil the requirements for performance qualification.