TR84
Integrating Data Integrity Requirements into Manufacturing & Packaging Operations
ISBN: 978-1-945584-19-0
© 2020 Parenteral Drug Association, Inc.
All rights reserved.
Table of Contents
1.0 INTRODUCTION AND SCOPE
 1.1 Purpose
 1.2 Scope
2.0 GLOSSARY AND ABBREVIATIONS
 2.1 Abbreviations
3.0 DATA INTEGRITY TRENDS AT INTERNATIONAL DRUG MANUFACTURERS
 3.1 Historical Perspective
 3.2 Regulatory Guidance
 3.3 Industry Best Practices
4.0 QUALITY RISK MANAGEMENT APPLIED TO DATA INTEGRITY
 4.1 Considerations in Assessing Risk
 4.2 Data Integrity Risk Management Model
 4.3 Data Vulnerability (9-Box)
 4.4 Using the Data Vulnerability Grid
 4.5 Data Process Flow Maps
5.0 DATA INTEGRITY CONTROLS
 5.1 Potential Differentiation of Controls
 5.2 Methodology Used to Determine Differentiated Controls
 5.3 Areas of Differentiated Data Integrity Controls
6.0 CONTROLS FOR BIG DATA AS IT RELATES TO DATA INTEGRITY
7.0 REFERENCES
8.0 APPENDIX I: EXAMPLES – HOW TO USE THE 9-BOX VULNERABILITY GRID
 8.1 API Process Examples
 8.2 Finished Dosage Form Examples
 8.3 Sterility Assurance Examples
 8.4 Tablet Packaging Process Examples
 8.5 References for Appendix I
FIGURES AND TABLES INDEX
Figure 3.1.1-1 FDA Warning Letters–DI Issues Related to Production Activities by Country of Inspected Location
Figure 3.1.1-2 Incidence of Warning Letters Citing Production Activities by Product Type (API vs Finished Pharmaceuticals)
Figure 3.1.2-1 EU Non-Compliance Reports–DI
Figure 4.2-1 Data Integrity Risk Management Model
Figure 4.5.1-1 Case 1: Granulation Operation (Paper-Based Manual Process without Automated Alerts)
Figure 4.5.1-2 Case 2: Granulation Operation (PLC-Controlled Process with Automated Alerts)
Figure 5.2-1 Methodology Used to Determine Differentiated Controls
Figure 8.3-1 Simplified Illustration of Aseptic Filling Process
Figure 8.4-1 Overview of the Packaging Process
Table 3.1.3-1 Examples of Data Integrity Issues in Production Operations cited by FDA and EU
Table 4.1.1-1 Human Factors Matrix
Table 4.2.1-1 Classification of Data Criticality
Table 4.2.2-1 Data Control Levels
Table 4.3-1 Data Vulnerability Grid (9-Box) – Example for Data Management Technology Controls
Table 4.4-1 Potential Areas of Data Integrity Vulnerability
Table 5.2-1 Categories Where Different Levels of Data Integrity Controls are Acceptable based on Criticality
Table 5.3.1-1 Data Integrity Control Grid for Storage of and Access to Completed and Archived Paper Records
Table 5.3.2-1 Data Integrity Control Grid for Issuance and Reconciliation of Paper Records
Table 5.3.3.1-1 Data Integrity Control Grid for Data Accuracy when Manually Recording without a Controlled Second Format
Table 5.3.3.2-1 Data Integrity Control Grid for Data Accuracy when Transcribing Manually Recorded Data into an Electronic System
Table 5.3.3.3-1 Data Integrity Control Grid for True Copy (Paper to Electronic)
Table 5.3.4-1 Data Integrity Control Grid for Access Controls for Electronic Systems
Table 5.3.5.1-1 ATRA Final Score and Frequency Review
Table 5.3.5.2-1 Example of How ATRA Tool is Used to Score a Filter Integrity Tester
Table 5.3.6-1 Data Integrity Control Grid for Data Backup for System with Limited Storage Capacity
Table 5.3.7.1-1 Data Integrity Control Grid for Data Export for Generation of Reports and Records
Table 5.3.7.2-1 Data Integrity Control Grid for Data Transfer and Migration Between Electronic Systems
Table 8.1-1 Example 1: High Criticality – API Reaction Controls Impurity Level
Table 8.1-2 Example 2: Medium Criticality – Drying of API (LOD not a CQA)
Table 8.1-3 Example 3: Low Criticality – Small Molecule API Batch-to-Batch Cleaning of Equipment within a Campaign
Table 8.2-1 Example 4: High Criticality – Humidity Monitoring During Sieving of a Drug Substance
Table 8.2-2 Example 5: Medium Criticality – Humidity Monitoring During Sieving of a Drug Substance
Table 8.2-3 Example 6: Low Criticality – Humidity Monitoring During Sieving of a Drug Substance
Table 8.3-1 Example 7: High Criticality – Filter Integrity Testing at the Point of Fill (Post-Use)
Table 8.3-2 Example 8: Medium Criticality – Filter Integrity Testing at the Point of Fill (Pre-Use–Post-Sterilization)
Table 8.3-3 Example 9: Low Criticality – Filter Identity Verification at the Point of Fill (Pre-Use Pre-Sterilization)
Table 8.4-1 Example 10: High & Medium Criticality – Tablet Packaging
1.0 Introduction and Scope
Data integrity is the cornerstone of establishing and maintaining confidence in the reliability of the
data that assures product quality and patient safety. The reliability of manufacturing production
and control data depends on the procedures, systems, processes, and controls in place to ensure data
integrity. In production, these controls start at the point of data creation and continue through the
entire data lifecycle, including the storage of data and the ability to retrieve it later to support the
quality of the products manufactured.
The term “data integrity” has evolved to denote the degree to which data are complete, consistent,
enduring, and available (ALCOA+) as well as the degree to which the characteristics of the data are
maintained for all operations throughout the data lifecycle. Pharmaceutical manufacturers need to
ensure that relevant data are available to document and provide traceability to what occurred, that
the data are attributable to the person who performed the activity and entered the data, and that the
data cannot be deleted, omitted, or in any way modified to misrepresent what occurred. All breaches
of data integrity, whether intentional or unintentional, must be investigated.
Advances in information technology and in-process monitoring sensors, coupled with lower costs of
computer memory and better processors, have resulted in explosive growth in electronic data generation,
acquisition, and storage. Applying a one-size-fits-all approach to data integrity controls established on
principles developed decades ago may be neither valuable nor, in certain instances, feasible. Therefore,
the use of a risk-based approach is essential to developing a robust data integrity program, considering
the requirements from a data-lifecycle perspective. Risk-based concepts apply even as industry transitions
from manual documentation systems to fully electronic or hybrid (manual and electronic) systems.
This technical report addresses data integrity from the perspective of manufacturing operations. It
discusses regulatory trends, risk management concepts, and recommendations for implementing ap-
propriate data integrity controls in manufacturing operations applicable to paper-based, electronic-
based and hybrid systems. The case studies included in this technical report provide examples of how
to assess current data integrity risks and implement the concepts presented here.
1.1 Purpose
The continuous and rapid advances in automation and information technology, availability of
economical data storage, and superiority of electronic audit trail capabilities over paper records have
compelled the pharmaceutical industry to rethink good manufacturing practices (GMP) controls.
Among the consequences is a heightened awareness of the need to establish and maintain effec-
tive data integrity controls at every stage of the manufacturing process for drug products. While
the requirement to maintain accurate and complete data is well recognized by industry, what is not
universally understood is the level of data integrity control needed at each step to comply with GMP
regulations. Many companies continue to struggle when deciding what controls are appropriate for
each manufacturing operation and what levels of review and verification are necessary to ensure reli-
able manufacturing control data. This technical report describes an approach using quality risk man-
agement (QRM) for establishing and assessing the appropriateness of data integrity controls for each
manufacturing operation based on the criticality and vulnerability of the data for its intended use.
Developed by subject matter experts from global industry and regulatory agencies, this technical
report summarizes manufacturing data integrity risks and identifies best practices that can be used to
develop and sustain robust documentation as well as data integrity management procedures, systems,
processes, and controls. Employing these practices will help users achieve compliance with applicable
laws, regulations, and directives for pharmaceutical products such as active pharmaceutical ingredi-
ents (APIs), solid oral dosage forms, sterile injectables, biologics, and vaccines.
Assuring data integrity is an organizational responsibility. Employees at any facility that manufac-
tures, processes, packages, or holds a finished pharmaceutical, intermediate ingredient, or API are
responsible for ensuring that data collected throughout the manufacturing process are accurate and
reliable. Managers, quality assurance personnel, operators, technicians, and support staff alike must
understand the importance of data integrity and their role in assuring it.
1.2 Scope
The information in this technical report applies to the management of data at pharmaceutical facili-
ties that manufacture, process, package, or hold a finished pharmaceutical, API, or intermediate.
Specifically, it addresses data pertaining to manufacturing operations, materials, facilities and equip-
ment, production, and packaging and labeling, including in-process controls and process analytical
testing. It also applies to the procedures, systems, processes, and controls used at a drug facility to
ensure that the drugs conform to applicable laws, regulations, and directives and that the data sup-
port the drug’s identity, strength, quality, and purity.
The methods and processes described here can be applied to facilities that manufacture drugs intend-
ed for use both in clinical trials and for commercial distribution. The principles may be extended to
facilities engaged in other activities (such as distribution of finished products to customers) or prod-
ucts (such as components, raw materials, medical devices, and combination products), though these
were not a principal consideration. This technical report is not intended to apply to data integrity
management in clinical practice and the implementation of clinical trials. Data integrity manage-
ment in laboratory systems is discussed in PDA Technical Report No. 80: Data Integrity Management
System for Pharmaceutical Laboratories (1), and data integrity management in quality management
systems will be discussed in a future technical report.
2.1 Abbreviations
AHU/HVAC: Air Handling Unit/Heating, Ventilation, and Air Conditioning
GxP: Good practice quality guidelines for various fields
ISPE: International Society for Pharmaceutical Engineering
TGA: Australian Therapeutic Goods Administration
The assessment of data integrity procedures, processes, systems, and controls using a consistent,
structured approach and forensic techniques was not always a primary focus during FDA inspec-
tions. Periodically, the detection of egregious and unethical events causes FDA to redouble its
efforts and refocus its attention on lapses in data integrity. For example, during the late 1980s and
early 1990s, FDA uncovered data integrity violations at a large number of U.S.-based generic drug
manufacturers (and some innovator companies) in what was later dubbed the “generic drug scandal.”
The FDA discovered widespread falsification of production and control records and found that many
abbreviated new drug applications contained falsified data and other serious data integrity violations.
Several individuals from the industry and the FDA were criminally prosecuted, and the scandal
prompted FDA to establish in 1990 a pre-approval inspection program (12) that remains in effect
today. That program requires successful FDA pre-approval inspections as a condition of approval for
drugs and biologics; subsequently, other major health authorities established similar pre-approval
inspection programs. Then, in 1997, FDA promulgated 21 CFR Part 11, a regulation governing
electronic records and electronic signatures in computer systems (13).
These developments impacted industry practices, too. Data integrity became a focus of companies,
industry conferences, and industry best practice documents such as the International Society for
Pharmaceutical Engineering (ISPE) good automated manufacturing practice (GAMP) guides and
PDA technical reports. Data integrity awareness and expertise increased among pharmaceutical
professionals, and the incidence of data integrity problems reported by FDA during its GMP inspec-
tions of domestic and international pharmaceutical manufacturers declined. Egregious data integrity
violations, however, still resulted in sanctions.
Starting in the late 1990s, the advent of the internet and growth in information technology fueled
increased globalization of pharmaceutical supply chains. This rapid expansion roused regulators’
concerns about data integrity. In 2007, FDA reported that it had found objectionable data integrity
issues at 3 of the 10 companies it had recently inspected on this topic (14). In the years immedi-
ately following, FDA again refocused its attention on data integrity practices. Since then, FDA has
included data integrity citations in more than 200 warning letters sent to companies in more than 20
countries.
Since 2012, inspections by the FDA, European Medicines Agency (EMA), Medicines and Health-
care products Regulatory Agency (MHRA), World Health Organization (WHO), and other health
authorities have revealed a notable increase in the incidence of data integrity issues at international
manufacturers of APIs and finished pharmaceuticals worldwide. The prevalence and significance of
the data integrity lapses discovered in recent years has led regulators to make data integrity a primary
focus during inspections, and regulators are detecting data integrity lapses with increasing frequency.
Pharmaceutical regulators continue to find conditions and practices that compromise data integrity,
among them human errors, inadequate management oversight, insufficient employee training, sys-
tem failures, inappropriate qualification or configuration of systems, poor procedures or not follow-
ing procedures, and intentional acts of falsification.
In recent warning letters, FDA has cited questionable data integrity practices in manufacturing opera-
tions. For example, a warning letter issued in July 2019 cited batch records in which handwritten
values were routinely within process parameters even though the values recorded by the programmable
logic controller (PLC) of the compression machine were frequently outside of the established process
parameters (15). A second warning letter, issued in December 2019, cited multiple discrepancies between
the human machine interface (HMI) data and the entries made by operators into batch records (16).
Figure 3.1.1-1 FDA Warning Letters–DI Issues Related to Production Activities by Country of Inspected Location (22 Feb 2012 – 15 Aug 2019)

Figure 3.1.1-2 Incidence of Warning Letters Citing Production Activities by Product Type (API: 35; FDF: 58; Both: 3)
The data in Table 3.1.3-1 is based on observations by FDA and EU inspectors during their inspections
of quality management systems for production operations between February 2012 and August 2019.
Table 3.1.3-1 Examples of Data Integrity Issues in Production Operations cited by FDA and EU, 2012–2019
Risk should be controlled regardless of whether the data are managed using a manual system or an
automated electronic system. Any potential breach related to data integrity—particularly the loss,
omission, deletion, or alteration of data—must be investigated, addressed, and accounted for in the
quality system and, as necessary, reflected in the risk management documentation. Given that data
may be managed and/or transferred between different formats or systems, efforts should be made to
mitigate the risk to the integrity of the data during these transfers. Therefore, judicious application
of these concepts and targeted application of data integrity principles is warranted as the industry
transitions from paper to hybrid (manual and electronic) and, eventually, to fully electronic systems.
Data integrity risk also should be addressed during the transfer of data between electronic systems.
The PDA Technical Report No. 54 series describes QRM more broadly, including how to conduct
an appropriate risk assessment and formulate a mitigation strategy to reduce the risks identified. For
more information on QRM generally and other important elements, such as risk communication,
please see the TR 54 series and ICH Quality Guideline Q9: Quality Risk Management (8,26).
Organizations also should consider performing a risk assessment after a deviation or incident to
identify potential gaps in the data management system. Assessment of data integrity risks also should
be done in conjunction with the change management process of a data management system to avoid
creating new or additional risks. This will enable informed decision-making regarding the extent
of the issue and the potential impact. Key objectives of this assessment are to determine if the root
cause is related to a gap in data management and, if so, to identify the extent of the gap and the
impact to the data.
Data integrity controls should be considered from both the behavioral and technical perspectives.
From a behavioral perspective, the overall quality culture, including communications and training, is
intimately related to the effectiveness of the data integrity program.
Table 4.1.1-1 Human Factors Matrix

Procedure Gap: Error caused by gaps in governing procedures defining what tasks should be performed and by whom, e.g., lack of or inadequate standard operating procedures (SOPs)
Knowledge Gap: Error caused by knowledge gaps in how to perform a task, e.g., lack of or inadequate training
Attention Slip: Error caused by a slip in attention when a frequently performed action goes wrong, e.g., due to multitasking or aggressive deadlines or time pressure
Memory Failure: Error caused by taking no action, e.g., failure to perform a routine task due to forgetting its place in the sequence
Fraud: Deliberate act to perform a fraudulent act, e.g., falsifying data for personal gain or to avoid personal pain
Misplaced Priority: Deliberate act of misplaced priority, e.g., ignoring established controls to compensate for an aggressive target or time pressure
One category is the unintentional act. Risk may be unintentionally introduced in a variety of ways.
The unintentional act may take the form of an “action error” caused by a slip in attention or lapse
of memory, or a “thinking error” caused by gaps in governing procedures (knowing what or who) or
knowledge (knowing how). For instance, numbers might be reversed in an entry or entered in the
wrong spot due to a slip in attention. A bar code might have been applied incorrectly to materials
that are brought into the processing room, and the operation started before operators realize that the
materials were for a different batch. Another example could be that one operator sets up the equip-
ment believing it will be used for one product, but a different product is brought in and processed.
The other category is the intentional act. Risk may be introduced intentionally through misconduct,
by failing to follow established procedures and controls due to misplaced priorities or through mali-
cious acts. For example, to personally benefit or to avoid personal consequences associated with hav-
ing made the mistake, an employee may cover up a mistake by creating false entries showing that the
desired activity or result occurred. In one misconduct scenario, the operator was not paying attention
to the mixing or blending operation and allowed it to run for longer than the validated time, but
entered a value within the required range. Another example could be an overworked operator who is
struggling to keep up with the manufacturing schedule and, in an effort to catch up, reprints the last
acceptable in-process test result with the batch number of the new sample. Determining the reason
for an intentional or malicious act may not always be possible.
In the case of unintentional risks to data integrity, it will suffice to follow the process described for
risk assessment in this section. Provided the “right” people are in the room—whether operators,
technicians, mechanics, programmers, and/or supervisors—the risk assessment is likely to identify
the areas in which an accident in data capture and recording is possible. The purpose of the risk as-
sessment is to determine the probability of that accidental error occurring.
When the risk to data integrity is the result of an intentional action, a comprehensive investigation
is needed. Intentional manipulation of data is hard to detect and may require the use of forensic
detection techniques. The risk assessment process may uncover some of the circumstances in which
deliberate acts have occurred, but eliciting such information during a risk assessment requires a great
deal of trust and solid assurances that no repercussions will result from contributing it.
Although automation is not the solution to every problem, it can be a useful tool as part of a risk
control strategy. Intelligent barcodes can be used to confirm product and component identity and
can prevent a process from proceeding until the error is rechecked and corrected. The time-honored
use of a second person checking or a supervisor reviewing is also important, provided that it is time-
ly. Frequently, the supervisory review takes place too long after an event to confirm that it occurred
as recorded. Moving further into the digital age, electronic automation will replace such reviews
when implemented and qualified appropriately.
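To make the automation control concrete, the following minimal sketch (ours, not from this report; the function names and material codes are hypothetical) shows the logic of a barcode identity gate that blocks an operation until a mismatched component is corrected:

```python
# Minimal sketch of an intelligent-barcode identity gate (illustrative only;
# names like `expected_materials` and the codes are hypothetical, not from TR-84).

def verify_components(expected_materials: set[str], scanned_codes: list[str]) -> list[str]:
    """Return the scanned codes that do NOT belong to this batch."""
    return [code for code in scanned_codes if code not in expected_materials]

def line_clearance_gate(expected_materials: set[str], scanned_codes: list[str]) -> None:
    """Block the operation until every mismatch is rechecked and corrected."""
    mismatches = verify_components(expected_materials, scanned_codes)
    if mismatches:
        # The process cannot proceed; the error must be corrected and rescanned.
        raise RuntimeError(f"Wrong components for this batch: {mismatches}")

# Usage: materials for one batch; one scanned code belongs to a different batch.
try:
    line_clearance_gate({"MAT-001", "MAT-002"}, ["MAT-001", "MAT-999"])
except RuntimeError as err:
    print(err)  # Wrong components for this batch: ['MAT-999']
```

The value of such a gate is that the check happens at the time of the activity, unlike a supervisory review performed long after the event.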
In order to estimate the level of exposure to potential data integrity issues, the following elements
must be considered, as illustrated in Figure 4.2-1:
• Data criticality
• Current (existing) level of data controls, including:
– Data management – how data is recorded (manual or automated)
– Human factors – the amount of human intervention involved in the manufacturing process
(manual or automated)
– GMP process – including written procedures, training, validation, etc.
Thus, data criticality and current level of data controls, taken together, can aid in identifying the risk
of data vulnerability. Data vulnerability can indicate the level of exposure to data integrity issues due
to intrinsic weaknesses in the control of three key elements: data capture technology (data management),
human factors, and manufacturing processes, or a combination of all three.
All data related to manufacturing is critical as it is necessary to allow for the reconstruction of actual
events; however, the criticality level may differ due to the data’s use and the controls under which it
was generated. Data that is related to product quality (or supports a disposition decision) and patient
safety would generally be classified as High criticality. Critical quality attributes (CQAs) and criti-
cal process parameters (CPPs) would be classified as High criticality. Data that is related to process
robustness and consistency would be considered Medium criticality. Process robustness is discussed
in detail in PDA Technical Report No. 54 and the other technical reports in the TR 54 series (26).
Data that is related to neither would be considered general GMP and of Low criticality.
Data criticality is a measure of the significance of a data point or data set in supporting the safety,
quality, identity, purity, potency, and/or effectiveness of the product being manufactured. The
manufacturer can best determine the criticality of the data generated by its own processes since it
knows the quality target product profile, CQAs, critical processes, and CPPs for the product being
manufactured (27). This concept also applies to clinical trial manufacturing. While CPPs and CQAs
may not be fully established in that context, initial information should be available to determine the
critical aspects of the manufacturing processes and existing controls.
Figure 4.2-1 Data Integrity Risk Management Model (Step 1: Determine Data Criticality, Section 4.2.1)

Note: While this technical report uses high/medium/low risk ranking levels, organizations may use different risk ranking levels and adjust their approach accordingly. Users would need to be mindful not to compromise the intent of this framework.
Table 4.2.1-1 Classification of Data Criticality

High: When the intended use of the data directly impacts product quality and/or product safety:
• Product quality – monitoring and control of processes that may be responsible for causing variability during manufacturing, release, or distribution impacting critical quality attributes, critical material attributes, critical process parameters, or critical process controls, including those that may be linked with the product registration dossier (where applicable)
• Product safety – monitoring and control of processes that ensure effective management of field alerts, recalls, complaints, or adverse events

Medium: When the intended use of the data relates to quality attributes, material attributes, process parameters, key process parameters, or process controls that are not CQAs/Critical Processes (CPs)/CPPs and may or may not be in the product registration dossier; this includes parameters of the manufacturing process that "may not be directly linked to critical product quality attributes but need to be tightly controlled to assure process consistency" (28)

Low: When the intended use of the data is to provide evidence of GMP compliance relating to monitoring and control of processes that do not fall into the High or Medium category
To determine the vulnerability of the data being collected, the user first must assess the level of data
criticality and the existing level of data control. Each will be assessed as High, Medium, or Low, us-
ing the definitions in Tables 4.2.1-1 and 4.2.2-1.
Subsequently, a 9-Box grid can be created by mapping the data criticality level with the data control
level. The 9-Box grid helps visualize how data criticality and data controls interact to rank levels of
risk. The criticality of the data will determine the row of the vulnerability grid, while the level of
control associated with the data will dictate the appropriate column.

Note: While this grid is based on the high/medium/low risk ranking levels, organizations may adapt it
to reflect local needs and to use different risk ranking criteria for data criticality and level of data
controls. The resulting grading of data vulnerability would likewise need to be adapted.
The objective is to have the right level of controls in place based on the criticality of the data, and
avoid being in the red or orange boxes. The goal of mitigation strategies, then, is to move from right
to left by increasing the level of data control. In this assessment, all data relevant to GMP is critical,
even if it relates to routine operations.
The 9-Box grid in Table 4.3-1 provides a visual depiction of the vulnerability of data given the
current level of data management controls and the criticality of the data. The 9-Box grid does not
provide any opportunity or cover for falsification of data.
Table 4.3-1 depicts only the data controls relating to data management technology. For a holistic assess-
ment of vulnerability, the data controls relating to human factors and the GMP process should be similarly
assessed. Appendix 1 provides further instructions for using the 9-Box grid, with illustrative examples.
Table 4.3-1 Data Vulnerability Grid (9-Box) – Example for Data Management Technology Controls
(Rows: Data Criticality Levels. Columns: Data Control Levels, High / Medium / Low.)

High Criticality (CQA/CP/CPP impacting quality & safety):
• H/H – Data Controls: Validated & effective automated or hybrid data capture & analysis system in place
• H/M – Data Controls: Hybrid systems or manual data capture, limited automated data analysis, manual data transcription
• H/L – Data Controls: Manual data capture, no automated data analysis, manual data transcription, heavy reliance on second-person witnessing of data entries

Medium Criticality (processes & process parameters that are not CQA/CP/CPP but need to be tightly controlled):
• M/H – Data Controls: Validated & effective automated data capture & analysis system in place; more controls than may be required based on data criticality
• M/M – Data Controls: Hybrid systems or manual data capture, limited automated data analysis, manual data transcription
• M/L – Data Controls: Manual data capture, no automated data analysis, manual data transcription, heavy reliance on second-person witnessing of data entries

Low Criticality (GMP compliance data that do not fall into high or medium criticality):
• L/H – Data Controls: Validated & effective automated data capture & analysis system in place; more controls than may be required based on data criticality
• L/M – Data Controls: Hybrid systems or manual data capture, limited automated data analysis, manual data transcription; more controls than may be required based on data criticality
• L/L – Data Controls: Manual data capture, no automated data analysis, manual data transcription, heavy reliance on second-person witnessing of data entries
The 9-Box also allows users to identify controls that could be implemented to decrease the risk to
data in the manufacturing process. Implementation of those controls could decrease data vulnerabil-
ity from a high level to a lower level (from right to left in the grid). Examples for implementing the
9-Box in specific scenarios are included in Appendix I.
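One way to operationalize the grid is a simple lookup keyed by criticality and control level. The sketch below is our illustration, assuming the high/medium/low rankings of Tables 4.2.1-1 and 4.2.2-1 and the zone colors described in Section 4.4; it is not a tool defined by this report:

```python
# Sketch of a 9-Box lookup (illustrative). Zone colors follow Section 4.4:
# red = controls insufficient, orange = could be improved,
# green = sufficient, blue = possibly more controls than required.

VULNERABILITY_GRID = {
    ("High", "High"): "green",  ("High", "Medium"): "orange",  ("High", "Low"): "red",
    ("Medium", "High"): "blue", ("Medium", "Medium"): "green", ("Medium", "Low"): "orange",
    ("Low", "High"): "blue",    ("Low", "Medium"): "blue",     ("Low", "Low"): "green",
}

def data_vulnerability(criticality: str, control_level: str) -> str:
    """Criticality selects the row; the level of control selects the column."""
    return VULNERABILITY_GRID[(criticality, control_level)]

# Example: a CPP captured by a fully manual process (Case 1 in Section 4.5.1).
print(data_vulnerability("High", "Low"))  # -> "red"
```

Mitigation then corresponds to changing the second key, that is, moving from right to left by increasing the level of data control.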
This technical report recommends, as the first step, selecting a product and identifying the data that
is critical for that product to meet its approved specifications and its intended use (see Section 4.2.1.)
The critical data for the product should have been identified already as part of the product develop-
ment process and included in corresponding documentation. After identifying the data that impacts
the product CQAs and the processes that are critical in its manufacture, the user can prioritize internal
assessments, identify risk factors, and categorize the data vulnerability for these critical processes.
Because three main elements impact data vulnerability—data management technology, human fac-
tors, and GMP process—the next step is to consider these three categories when assessing the specific
data management process for the operation being evaluated. After completing this analysis for a
specific process or unit operation, the completed information may be leveraged for the same process
or unit operation used for another product, giving due consideration to potential differences.
For data management, the full lifecycle of the data needs to be considered to determine areas of
vulnerability. This includes how the data is captured (manual/automated/hybrid system), frequency
of collection, validation of any automated systems used to capture the data, migration of produc-
tion data to data analysis tools, data transcription, and data archiving. Here, again, existing data flow
maps from previous assessments can be leveraged.
For human factors, points of human intervention in the data lifecycle need to be considered to
determine data vulnerability. As people are prone to error, processes that rely heavily on manual data
capture and review are considered inherently more vulnerable than those that are automated. To deter-
mine the reliability of the current processes, the assessment should consider trends related to data tran-
scription errors and poor application of good documentation practices. Conversely, automated data
collection, properly validated, will be less prone to error and more likely to yield high quality data.
For GMP process, the complexity and robustness of the manufacturing process being evaluated
should be considered to determine areas of data vulnerability. More complex processes are likely to
produce a larger volume of data, which could increase the number of factors that impact the data
vulnerability. The assessment needs to look at all aspects of the manufacturing process and evaluate
each process on a case-by-case basis. A newer manufacturing process that has been validated recently
and that lacks a history of robustness will have greater data vulnerability than an established process
with a high-process capability. Trends related to change control, out of specification (OOS) results,
alarms, and out-of-calibration instances can also illustrate the vulnerability of data being collected
for the manufacturing process. Once identified, the trends should be investigated to determine their
relationship to data vulnerability.
GMP Process:
• Aborted runs (e.g., due to lack of planning, understanding of system operation, or
suitability of equipment and process)
• Complex process (e.g., many interfaces, high level of human intervention, high
levels of manual data entry)
• Data flows and ownership not well defined
• Inadequate line clearance checks
• Negative trends related to changes, deviations, out of specification, alarms, out of
calibration
• New or complex processes (causing repeat errors)
• Operation switching (GMP vs. non-GMP)
• Unavailability of quality-related data requested during regulatory inspections
Once the risk factors have been identified, the third step is to assess the existing controls as described
in Section 4.2.2. The controls in place should have been identified, for instance, by using a data flow
map as discussed in Section 4.5. A historical review of the information related to data integrity issues
contained in the quality management system (QMS) can be valuable in determining the adequacy of
the existing data integrity controls.
If the operation being evaluated fits into a red box (H/L) as shown in Table 4.3-1, the controls in
place are insufficient for the highly critical data and risk factors that have been identified. If it fits
into one of the two orange boxes (H/M or M/L), the data integrity controls in place for that opera-
tion could be improved, based on the criticality of the data and the risk factors identified. If the op-
eration fits into one of the green boxes (H/H, M/M, L/L), the controls in place are sufficient for the
criticality of the data and identified risk factors. If the process fits into one of the blue boxes (M/H,
L/H, L/M), the controls in place may be more than required for the criticality of the data and
the minimal risk factors identified. This does not imply that the controls for these processes should
be decreased, but that resources potentially could be shifted to other processes, such as those in the
orange and red boxes.
Points in the process where data integrity could be compromised should be identified. Asking a series
of critical-thinking questions at each stage of the data flow throughout the lifecycle can facilitate the
process, for example:
• Are unique accounts assigned to each user?
• Is an audit trail available that demonstrates if and when data is manipulated in any way, and
does it identify who has performed the operation?
• Is data held in temporary memory? What happens if there is no power to this unit?
For instance, unauthorized access to a system could potentially allow for manipulation of batch-re-
lated data. These points where data integrity could be compromised represent areas of risk. For each,
a risk assessment should be conducted, and mitigations to reduce risk identified and implemented.
Differences in detectability (including elements related to electronic vs. manual processes) should be
considered as part of the risk assessment.
1. Identify the key stakeholders involved in the process and hold a brief interview to obtain an
overview of the process.
2. Create a process data flow map to determine:
– If the data is captured in paper or electronic format,
– If the data flow takes place in real-time (solid line) or later (dashed lines),
– If automated printed outputs are available with out-of-validated-range alert, and
– If procedural controls are already in place or in use (red ovals in Figure 4.5.1-2).
3. Once the process/data flow is graphed, areas of data integrity vulnerability can be highlighted
and used as input to build a remediation plan (one way to represent such a map in software is
sketched below).
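A data flow map can also be represented as a simple data structure so that vulnerable points can be flagged systematically. The sketch below is illustrative only; the field names (capture_format, real_time, automated_alert, procedural_controls) are our own, chosen to mirror the attributes listed in step 2:

```python
# Sketch of a process data flow map as a data structure (illustrative only;
# field names are hypothetical, not from TR-84).

from dataclasses import dataclass

@dataclass
class DataFlowStep:
    name: str
    capture_format: str             # "paper" or "electronic"
    real_time: bool                 # solid line (True) vs dashed line (False)
    automated_alert: bool           # out-of-validated-range alert available?
    procedural_controls: list[str]  # e.g., second-person verification

def vulnerable_points(flow: list[DataFlowStep]) -> list[str]:
    """Flag manual steps with no automated alert as input to a remediation plan."""
    return [s.name for s in flow
            if s.capture_format == "paper" and not s.automated_alert]

granulation = [
    DataFlowStep("Enter start/end times", "paper", True, False,
                 ["second-person verification"]),
    DataFlowStep("QU review of BPR", "paper", False, False, ["QU review"]),
]
print(vulnerable_points(granulation))  # both steps flagged for Case 1
```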
Case 1 (Figure 4.5.1-1) depicts a paper-based, manually controlled granulation process with an end-
point of total granulation time that has been derived from process validation work, included in the
master batch record, and approved by the quality unit. Case 1 relies on the data being captured by
the operator, with second-operator verification, by wet ink recordings directly into the paper batch
production record in real time. In this example, the endpoint of the granulation operation is a CPP
that impacts the CQA of the finished product and would therefore be considered High-criticality
data. The existing level of controls would be considered Low because of the total manual operation
(see Section 4.2.2.). The only controls in this case are the second-person verification and a review by
the quality unit.
In Case 2 (Figure 4.5.1-2), the same validated granulation process has been automated. The total
granulation time has been included in a recipe, approved by the quality unit, and programmed into
a PLC that controls the process. This programming includes an automated alert when the validated
range is exceeded. The control level is now considered High because the validated PLC automated-
data collection system does not depend on any human intervention. With this level of control and
the PLC’s printout of the granulation process, which includes the data on the total granulation time
and an out-of-range alert, second-person verification is not necessary. The automated data collection
further protects the integrity of this High-criticality data by preventing changes to the total granula-
tion-time data in the system. The full batch production record is still reviewed by the quality unit as
part of the release process.
Automated data capture of the granulation time, printouts of the data, and automated alerts when
the total granulation time exceeds the validated range can reduce the procedural burden as well as
improve data integrity assurance. This becomes obvious when comparing the data flow of Case 1 to
that of Case 2. In Case 2, where the PLC controls the process, captures and stores the data, and gen-
erates a printout with automated alerts that is attached to the paper batch production record, there is
no need for peer review of the data on a per-batch basis.
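The automated alert in Case 2 amounts to a range check programmed into the PLC recipe. A minimal sketch of that logic follows (illustrative; the validated range values and function name are assumptions, not taken from a real recipe):

```python
# Minimal sketch of the Case 2 automated alert (illustrative; range values
# are assumed, not from a real validated recipe).

VALIDATED_RANGE_MIN = 8.0   # minutes, per the approved recipe (assumed)
VALIDATED_RANGE_MAX = 12.0

def check_granulation_time(total_minutes: float) -> str:
    """Return an alert when the total granulation time leaves the validated range."""
    if VALIDATED_RANGE_MIN <= total_minutes <= VALIDATED_RANGE_MAX:
        return "PASS: total granulation time within validated range"
    # The alert prints with the batch record; a deviation is then initiated.
    return f"ALERT: {total_minutes} min outside validated range -> initiate deviation"

print(check_granulation_time(13.5))
```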
Figure 4.5.1-1 Case 1: Granulation Operation (Paper-Based Manual Process without Automated Alerts). Flow: Operator 1 confirms components and enters start and end times (start granulation); if the total time is within the validated range, the BPR is reviewed for total granulation time; if not, a deviation is initiated.

Figure 4.5.1-2 Case 2: Granulation Operation (PLC-Controlled Process with Automated Alerts). Flow: Operator 1 confirms components; the PLC captures the granulation data and, when the validated range is exceeded, a deviation is initiated.
PDA recognizes that quality culture is critically important in the prevention and detection of data
integrity breaches; however, this technical report is designed to identify risk and potential mitigation
of a data integrity breach regardless of where on the spectrum of data integrity (system error, indi-
vidual error, individual or institutional malfeasance) that breach occurs. Of course, intentional fraud
and intentional manipulation of data are unacceptable under any circumstances, regardless of data
criticality, vulnerability, or existing level of controls.
In supporting and communicating expectations around a culture of quality for data integrity, PDA
recommends that entities use the PDA publication, Elements of a Code of Conduct for Data Integrity
in the Pharmaceutical Industry (22, 21), which outlines key elements necessary to help ensure the
reliability and integrity of data throughout all aspects of a product’s lifecycle. It can be used, in whole
or in part, as a control approach to guide a company's internal practices, or to create or modify an
existing code of conduct.
An important part of preventing data integrity breaches is identifying, establishing, and maintaining
proper controls, whether the data elements are electronic, paper-based, or a combination of the two.
Ensuring data integrity means protecting the original data from accidental or intentional modifica-
tion, falsification, and deletion. Requirements specifying that data need to be ALCOA+ throughout
the data lifecycle have been included in regulations around the world for several decades. The WHO,
FDA, MHRA, and PIC/S have all released documents to reeducate the industry on current data integ-
rity concepts and expectations. These are the “what”—what is required to ensure the integrity of data.
Data integrity controls can be embedded in working procedures and practices or included as part
of computer system validation and equipment qualification or usage. These controls may be used to
minimize the potential that a data integrity issue will occur and to increase the probability of detect-
ing an issue that does occur. The data integrity controls are the “how”—how to ensure that these
requirements are implemented in practice.
Regulations convey the core elements that must be in place to ensure data integrity and a robust
QMS, including some of the following examples:
• Standard operating procedures (SOPs) are in place for completion and retention of records.
• People are suitably trained for their intended job purpose.
• Systems are validated or qualified for their intended purpose.
• People have access to the records required to perform their activities.
• If electronic signatures will be used, they should be attributable to an individual, unable to be
altered, permanently linked to the document, and have a time and date stamp (see the sketch
following this list).
• Data integrity risks should be identified as part of the QRM process and dealt with accordingly.
• A management review process allows data integrity issues to be elevated to the highest levels of
the organization.
• The data governance policy is endorsed at the highest levels of the organization, empowering a
strong data integrity culture at all levels of the organization.
• An anonymous reporting program, supported by company policy, encourages personnel to bring
breaches in data integrity to the attention of management.
• Routine data verification and periodic surveillance checks should be performed.
• Self-inspections should include verification of the effectiveness of the data integrity controls.
• A strong change management system is required.
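As an illustration of the electronic-signature attributes listed above, the sketch below models a signature record that is attributable, time-stamped, and permanently linked to a document hash. This is our example only; a real system would enforce immutability through a validated, access-controlled application rather than a frozen dataclass:

```python
# Sketch of an electronic-signature record (illustrative; names are hypothetical).

from dataclasses import dataclass
from datetime import datetime, timezone
import hashlib

@dataclass(frozen=True)          # frozen: the signature record cannot be altered
class ElectronicSignature:
    user_id: str                 # attributable to an individual
    document_sha256: str         # permanently linked to this exact document content
    signed_at_utc: str           # time and date stamp

def sign(user_id: str, document_bytes: bytes) -> ElectronicSignature:
    return ElectronicSignature(
        user_id=user_id,
        document_sha256=hashlib.sha256(document_bytes).hexdigest(),
        signed_at_utc=datetime.now(timezone.utc).isoformat(),
    )

sig = sign("jdoe", b"Batch record B123, page 4 ...")
print(sig)
```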
This portion of the technical report focuses on only a small part of all data integrity controls required
by regulations. Most data integrity controls required by health authority regulations apply in the
same manner for any GMP data, that is, the controls do not differ based on the criticality of the
processes performed by the organization. Such controls are not discussed further in this technical
report. Two examples illustrate controls that do not differ by criticality:
• GMP regulations require that entries in records should be indelible, made in the appropriate spaces,
identify the person making the entry and, if different, identify the person who performed the task(s).
Corrections, if required, should be dated and signed, leaving the original entry still readable, with a
comment added to explain the correction. Envision two different scenarios in which a person makes
a correction because of a typographical error: One person has recorded a critical parameter during
aseptic filling, while another has documented a training activity, which is less critical. In both
circumstances, the individuals involved should make the necessary correction, explain why the
correction was made, and sign with the time the correction was made, leaving the original entry
still legible.
• GMP regulations also require that, in electronic systems, a user’s privileges should be limited to allow
the employee to perform the work, without any additional unnecessary privileges. Access to change
the date or time of the system, for example, should be limited to a very small group of people, ideal-
ly independent administrators. Regardless of whether the system is highly critical (e.g., granulation)
or less critical (e.g., secondary packaging systems that do not store data relating to quality attributes
or parameters such as expiry dates), the same controls should be in place to restrict access.
Any given system may have some controls for which it is possible to differentiate the level of controls
applied, and some controls that will be the same regardless of the criticality of the activity.
Figure 5.2-1 Methodology Used to Determine Differentiated Controls
The authors identified seven categories of controls for which it is acceptable to differentiate the level
of control applied, as shown in Table 5.2-1. Each category was assessed to determine whether the
data, record, or system was being controlled and then was further scrutinized to determine which
controls within that category may be differentiated. For example, based on criticality, the technical
report team considered whether it is acceptable to differentiate between the frequency of control
(daily vs. monthly), who performs the control (manufacturing vs. quality unit), or what needs to be
controlled (more-detailed vs. less-detailed review).
Where differentiation is acceptable, details of how the controls differ based on criticality are dis-
cussed in detail in Section 5.3. The level of controls required may be applied at the data, record,
or system level, depending on which of the seven categories are being discussed. Any requirements
within a category that remain consistent across the criticality levels are considered basic elements in
ensuring a robust QMS and are out of scope for this technical report.
Table 5.2-1 Categories Where Different Levels of Data Integrity Controls are Acceptable based on Criticality
Category 1 – Storage of and access to completed/archived paper records (Type of System: Paper):
Controls associated with storage of and access to completed/archived paper records, and who has
access to them.
Sections 5.3.1–5.3.7 have been developed with manufacturing operations in mind; however, many
of the concepts discussed can also be applied to other functional areas (e.g., analytical laboratories).
Sections 5.3.1–5.3.7 detail how differentiated controls may be applied for different levels of criticality.
Most of the controls discussed are prevention controls, but a few detection controls are included as well.
Some of the differentiated data integrity controls should be applied at the system level, others relate to
records, and some are relevant to data points. Each table provides the necessary details concerning detec-
tion vs. prevention and the level where the control should be applied (data point, record, or system).
Those sections provide examples of critical activities designated as High, Medium, and Low for il-
lustrative purposes and are not prescriptive; the examples are intended to stimulate thinking. Each
entity should assess its own data and apply the concepts in this technical report to determine the
controls necessary for their operations. In addition, suggestions are provided throughout, illustrative
only, regarding the frequencies at which certain activities should take place. Organizations should
determine the frequencies that are most applicable to their operations using a risk-based approach to
ensure that the periodicity of activities is commensurate with the risks identified.
Sections 4.3 and 4.4 discuss that, in the data vulnerability model, the risks associated with data
management, human factors, and GMP process must be identified, and the controls in place must
be assessed to confirm they are sufficient for the risks identified. Sections 5.3.1–5.3.7 primarily
discuss data management controls.
The controls discussed below would result in data being placed in one of the green boxes (H/H;
M/M; L/L) of the 9-Box grid (Table 4.3-1). That is, the controls would be adequate for the critical-
ity of the data and the identified risk factors. If fewer controls are in place than those recommended
in Sections 5.3.1–5.3.7, the level of data management controls is not adequate for that level of data
criticality. On the other hand, if more controls than required are in place, the data vulnerability in
these specific areas is very low.
The controls discussed in Sections 5.3.1–5.3.7 are only one part of the controls that need to be
considered to assess the overall data vulnerability. Organizations also must consider the data manage-
ment controls, human factor controls, and GMP process controls that apply without any possible
differentiation in level of control. Whether the levels of control required are differentiated or not, all
controls must be described in SOPs.
All documents should be stored in a medium that will not degrade the readability of the records
upon storage. Documents of High and Medium criticality, including batch records and validation
protocols, should be stored in a climate-controlled environment with a card key or restricted access
and a logbook recording the removal and return of records. An annual review of who has access to
the room is recommended.
For documents with Low criticality, such as completed user-access authorization forms where evi-
dence of training exists in another system, storage in an office retention location, such as a local file
cabinet with limited access to the file cabinet keys, would suffice. Who has access to the keys should
be reviewed every two years.
Table 5.3.1-1 Data Integrity Control Grid for Storage of and Access to Completed and Archived Paper Records

Documents checked-in/checked-out/returned:
• High – Logbook recording the removal and return of the record
• Medium – Logbook recording the removal and return of the record
• Low – No logbook required

Access Control:
• High – Card key access or limited key access with entry documented in logbook
• Medium – Card key access or limited key access with entry documented in logbook
• Low – Limited key access

Periodic User Access Review:
• High – Annually
• Medium – Annually
• Low – Every 2 years
Highly critical records, such as batch records, should be issued by a designated unit and have a
unique identifier. The printing of these records should be controlled, i.e., a specified number of
people authorized by the quality unit to print, with full traceability of the number of copies printed,
when they were printed, and by whom. Full reconciliation of the record, including extra sheets or
additional pages, is required after use. Destruction procedures for unexecuted (blank) forms should
be in place and may be performed by the issuing unit.
For Medium-critical documents, like the form used for calibration, a unique identifier for the record
is not required. Controlled printing should ensure that the templates are printed by a limited num-
ber of people authorized by the quality unit. While an unlimited number of copies of the template
is allowed, a process should be in place to avoid misuse, including use of expired or outdated forms.
Full reconciliation of records and pages based on the quantity issued is warranted. Destruction
procedures for unexecuted (blank) forms should be in place and may be performed by the operating
unit or issuing unit.
Low-critical documents, for example, the blank forms used for recording training activities, may be
printed by the end-user without a unique identifier, and no reconciliation activities are required after
use. The number of copies that can be printed is not limited, and any unexecuted (blank) forms may
be discarded by the user without the quality unit being directly present in the process.
Even though the quality unit does not need to be present for the destruction of unexecuted (blank)
forms, it must maintain oversight to ensure that procedures are followed (e.g., through approval of
relevant SOPs and/or audit).
Acceptable quality unit oversight may constitute all or part of the following:
• Quality unit is responsible for approving the procedure,
• Processes are reviewed in practice as part of the self-inspection program, and/or
• Processes are reviewed in practice as part of the routine quality unit walkthrough of manufacturing.
Table 5.3.2-1 Data Integrity Control Grid for Issuance and Reconciliation of Paper Records

Reconciliation:
• High – Full reconciliation of records and pages based on unique identifier
• Medium – Full reconciliation of records and pages based on quantity issued
• Low – No reconciliation

Controlled Print:
• High – Required
• Medium – Required
• Low – Not required

Bulk printing allowed?
• High – No
• Medium – Yes, with process in place to avoid misuse
• Low – Yes

Destruction of blank forms:
• High – Performed by the issuing unit, quality unit oversight required
• Medium – Performed by the operating or issuing unit, quality unit oversight required
• Low – Performed by the individual, quality unit oversight required
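Reconciliation of uniquely identified records, as required for High criticality, can be pictured as a set comparison between what was issued and what was returned or destroyed. The following sketch is illustrative only; the identifier scheme is hypothetical:

```python
# Sketch of a post-use reconciliation check for issued paper records
# (illustrative; identifiers are hypothetical).

def reconcile(issued_ids: set[str], returned_ids: set[str],
              destroyed_ids: set[str]) -> set[str]:
    """Return identifiers that were issued but neither returned nor destroyed."""
    return issued_ids - returned_ids - destroyed_ids

issued = {"BR-123-p1", "BR-123-p2", "BR-123-extra1"}
returned = {"BR-123-p1", "BR-123-p2"}
destroyed: set[str] = set()
print(reconcile(issued, returned, destroyed))
# {'BR-123-extra1'} -> the extra sheet must be located before release
```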
5.3.3 Manual Activities when Recording or Transferring Data between Paper and
Electronic Formats
Hybrid data or records are used in many manufacturing operations. An example of this is a system
that captures and stores electronic raw data and generates a printed report that is used during the
documentation review and approval process. When this happens, the data in the electronic system and
the printed report (e.g., times, temperatures, critical parameters, pass/fail results) must be the same.
In some situations where hybrid systems are employed, it is acceptable to differentiate the levels of
controls required based on the criticality of the activity. For manual activities, when recording or
transferring data between paper and electronic formats, three scenarios were identified:
5.3.3.1 Data Accuracy when Manually Recording Data without a Controlled Second
Format
5.3.3.2 Transcription of Manually Recorded Data into an Electronic System
5.3.3.3 True Copy (Paper to Electronic)
The controls are applied at the data level for the first and second scenarios, and at the record level
for the true copy scenario. All controls are designed to prevent a potential data integrity issue from
occurring.
5.3.3.1 Data Accuracy when Manually Recording Data without a Controlled Second Format
Table 5.3.3.1-1 applies when a manual recording (recording on paper or direct manual recording
into an electronic system, like a manufacturing execution system (MES)) is made without a con-
trolled second format available. This means that a reading, measurement, or result is observed from a
display or another source, and no audit trail, printout, photo, or other media is available to corrobo-
rate the data recorded. These recommendations are intended for use while an organization is mov-
ing toward technological improvements that allow for a controlled second format such as electronic
audit trail, printout, photo, or other media.
Low-critical data points, for example the start time of a secondary packaging operation for
products not time- or temperature-sensitive, only require a general check by a peer to en-
sure adherence to good documentation practices. A second-person check for accuracy is not
required.
Table 5.3.3.1-1 Data Integrity Control Grid for Data Accuracy when Manually Recording without a Controlled
Second Format
5.3.3.2 Transcription of Manually Recorded Data into an Electronic System
For High-criticality data, like a CPP, second-person verification against the initial recorded data must
be performed by a supervisor or the quality unit, as described by quality oversight procedures. The
second-person review should ensure that the data entered manually into the electronic system cor-
responds to the original raw data. As original raw data is available (print out/manual recording), the
second-person review need not be performed at the time of the activity but must be completed prior
to batch disposition.
For Medium-criticality data, like a reading of environmental data for a compression operation, the
second-person verification may be performed by a peer. For Low-critical data, like the time of start-
ing manufacturing, no second verification is needed.
5.3.3.3 True Copy (Paper to Electronic)
For High-criticality records, such as paper batch records that may be needed for downstream use by
a sister site or another department, a trained user can make the true copy, but a second person from
the quality unit must perform a documented review. The review must ensure the scan is legible, ac-
curate, and complete (all pages, all entries) and confirm that all entries on the true copy provide all
information and data that are contained on the original record. For example, if data is color-coded
or highlighted, the true copy must reflect the same color-coding or highlighting, with the intent of
capturing the context and meaning. In addition, the original record may be discarded with quality
unit oversight, unless the original document has a seal, watermark, or similar identifying mark that
cannot be accurately reproduced electronically, in which case it cannot be discarded. The quality unit
should determine the level of oversight required for destruction, which may range from quality unit
presence during destruction to periodic confirmation of compliance with procedures.
For Medium-criticality records, like a validation protocol, a trained user can make the true copy. A
documented second-person review is required, though the second person need not be from the qual-
ity unit. The original record may be discarded by the operating unit unless the original document has
a seal, watermark, or identifying mark. While the quality unit does not need to be directly involved
in the destruction process, it must retain a suitable level of oversight to ensure the procedures are
followed.
For Low-criticality records, such as completed user access authorization forms where evidence of
training exists in another system, the person making the true copy must document that they re-
viewed the scan for legibility, accuracy, and completeness. The original may be discarded by the
individual, with quality unit oversight to ensure that procedures are followed.
As with all controls relating to data integrity, processes relating to true copies, regardless of critical-
ity, must be documented in SOPs. A record must be maintained to reflect when each document was
destroyed and who destroyed it.
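As an illustration only, such a destruction record could be modeled along the following lines; the structure and field names are assumptions of this sketch, not requirements of this technical report.

```python
# Minimal sketch of a destruction-log entry supporting the requirement that
# each destroyed original is traceable to when it was destroyed and by whom.
# Field names are illustrative only.
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class DestructionLogEntry:
    document_id: str        # identifier of the original paper record
    true_copy_ref: str      # reference to the verified true copy (e.g., scan file)
    reviewed_by: str        # person who documented the legibility/accuracy/
                            # completeness review of the true copy
    destroyed_by: str       # person who destroyed the original
    destroyed_at: datetime  # when the original was destroyed

entry = DestructionLogEntry(
    document_id="BR-2024-0042",
    true_copy_ref="scans/BR-2024-0042.pdf",
    reviewed_by="QA Reviewer",
    destroyed_by="Operator A",
    destroyed_at=datetime(2024, 5, 17, 14, 30),
)
```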
Table 5.3.3.3-1 Data Integrity Control Grid for True Copy (Paper to Electronic)
• Review of the true copy for legibility, accuracy, and completeness. High criticality: Documented review by a second person from the quality unit. Medium criticality: Documented second-person review; the reviewer need not be from the quality unit. Low criticality: The person making the true copy documents their own review.
• Discard of original allowed. High criticality: Yes, as defined by quality unit oversight, unless there is a seal, watermark, or other identifier that cannot be accurately reproduced electronically. Medium criticality: Yes, performed by the operating unit, unless there is a seal, watermark, or other identifier that cannot be accurately reproduced electronically; quality unit oversight required. Low criticality: Yes, the individual can discard the original; quality unit oversight required.
For systems storing High-criticality data, such as an MES, historian, or filter integrity tester used
during sterile filling, unique identification and authentication controls should be in place for each
individual user. Other features also should be available for use, such as mandatory password changes
every 90 days, lock-out after five failed attempts, password recycling only allowed after 10 cycles, and
an annual review of user accounts.
For systems storing Medium-criticality data, like parameter-setting or executed run files for a wash-
ing machine, many of the controls for systems storing Highly critical data likewise should be ap-
plied. Certain measures can be less stringent: mandatory password changes can occur every 180 days
(instead of 90), passwords can be recycled after 5 cycles (instead of 10), and users locked-out after 10
incorrect tries (instead of 5).
For Low-critical systems, for example, an HMI on a secondary packaging line that controls conveyor
speed but does not store data, the access controls may be far less rigorous. For these types of systems,
a passcode or group accounts are acceptable. Controls must be in place, however, to ensure segre-
gation of duties between user levels (i.e., administrator vs. engineer vs. operator). The passcode or
group accounts should be changed annually. These controls may also be used for other Low-critical
activities that do not require the action to be attributable. For example, a group account may be used
to press the start/stop button on a packaging line when the system requires a log-in to activate the
stop/start function.
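For illustration, the differentiated access controls above can be summarized in a configuration-style structure. In the sketch below, the numeric values come from this section, while the dictionary structure and key names are assumptions of the sketch.

```python
# Illustrative encoding of the differentiated access controls described above.
ACCESS_CONTROL_POLICY = {
    "high": {    # e.g., MES, historian, filter integrity tester
        "unique_user_accounts": True,
        "password_change_days": 90,
        "lockout_after_failed_attempts": 5,
        "password_reuse_after_cycles": 10,
        "account_review": "annual",
    },
    "medium": {  # e.g., parameter-setting/run files for a washing machine
        "unique_user_accounts": True,
        "password_change_days": 180,
        "lockout_after_failed_attempts": 10,
        "password_reuse_after_cycles": 5,
        "account_review": "annual",
    },
    "low": {     # e.g., an HMI that controls conveyor speed but stores no data
        "unique_user_accounts": False,   # passcode or group accounts acceptable
        "passcode_change": "annual",
        "segregation_of_duties": True,   # administrator vs. engineer vs. operator
        "account_review": "every 2 years",
    },
}
```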
Automatic log-out is not discussed as a differentiated control but should be considered as an overall
control factor. The amount of time that should be allowed to pass before an inactive user is automatically
logged out should be commensurate with the risks identified, and documented accordingly.
As stated above, the suggested frequencies at which certain activities should take place are illustra-
tive only. Organizations should determine the frequencies that are most applicable to their opera-
tions using a risk-based approach to ensure that the periodicity of activities is commensurate with
the risks identified. This assessment should consider the overall system architecture, the interfaces
between systems, criticality of access by unauthorized persons, and the potential for network security
breaches.
Table 5.3.4-1 Data Integrity Control Grid for System Access
• Password change frequency. High criticality: Every 90 days. Medium criticality: Every 180 days. Low criticality: Passcode or group account changed annually.
• Periodic user account access review. High criticality: Annually. Medium criticality: Annually. Low criticality: Every 2 years.
• Account lock-out after repeated incorrect password entries. High criticality: After 5 incorrect password entries. Medium criticality: After 10 incorrect password entries. Low criticality: Never.
• Password recycling (reuse of previously used passwords). High criticality: Allowed after 10 cycles. Medium criticality: Allowed after 5 cycles. Low criticality: N/A.
Secure, computer-generated, time-stamped audit trails must allow for reconstruction of the history
of events relating to the record, including the “who, what, and when” of the action. Where techni-
cally possible, comments should be added, in the case of modifications or corrections, so that the
why of the activity also is recorded as part of the audit trail. Where not technically possible to record
in the audit trail the reasons for changes or deletions to GMP-relevant data, alternate documentation
should be used.
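As a minimal illustration, an audit trail entry that satisfies these expectations might carry fields along the following lines; the field names are assumptions of this sketch, not terminology prescribed by this report.

```python
# Minimal sketch of the fields an audit trail entry needs so that the history
# of a record can be reconstructed ("who, what, when" and, where technically
# possible, "why").
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass(frozen=True)
class AuditTrailEntry:
    who: str                   # user performing the action
    what: str                  # action, e.g., "set-point changed 60 C -> 65 C"
    when: datetime             # secure, computer-generated timestamp
    why: Optional[str] = None  # reason for the change; if the system cannot
                               # capture it, record it in alternate documentation
```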
The electronic audit trail may comprise multiple sources related to the equipment; that is, the infor-
mation may be held in files called something other than “Audit Trail,” for example, event log, alarm
log, system log, history file, trends, or reports. Understanding where the information is recorded
within the system, at both the operating system and application level, is important to help protect
the data and understand where in the system to look when performing an audit trail review.
In the following sections, audit trail review is discussed for three different types of data contained
within the electronic audit trails:
• Data associated with batch setup, for example, the recipe used, and the set-points or times
planned for specific activities during the batch
• Batch run data, which is data from the manufacturing process, for example, temperatures, dura-
tion of activities, who performed the activities
• System configuration data, which are not batch-relevant data but the data associated with the
configuration of the system, for example, user-account information, privileges granted to users,
where data is stored
Reviewing audit trails enables detection of potential data integrity issues. Some regulations require
that audit trails be reviewed on a periodic basis (2-3). There is no expectation that every part of the
audit trail will be reviewed for every batch; rather, it is expected that a risk-based approach will be
taken to identify the appropriate level of review to conduct.
The ATRA focuses on the detectability of changes to the data, the criticality of the data (in terms of
patient safety or product quality), and the probability or capability for changes to the data to occur.
A cross-functional team should be assembled to conduct the ATRA. The team should comprise
people who fully understand the technical capabilities and limitations of the system, how the system
is set up (from a user management perspective), where the data is stored, what type of data is part of
the audit trail. The team also should include users who are familiar with operating the system.
The first step in the process is to score each of the three factors (detectability, severity, and probability) for each element of the audit trail, based on the system controls in place. For each factor, a lower score represents lower risk. Examples of scores that may be used are provided in this section, although each entity may modify the scoring based on its own analysis and risk tolerance.
• Detectability Score is based upon the likelihood that modification or deletion of data will be detected. There are two possible Detectability Scores: 1 or 0.
If it is possible to detect any modifications or deletions to the data contained within the audit
trail, and this data is already reviewed as part of an existing review as required by SOP, perform-
ing additional review of the audit trail to detect potential data integrity issues is not required.
For example, if the batch report lists the critical parameters used during manufacturing and
shows any changes made to these parameters during batch manufacturing, an audit trail review
of the same information need not be conducted. (This assumes that the batch report is generated
automatically from the equipment and cannot be manipulated.) Score = 0.
If the data has not already been reviewed or has only been partially reviewed (i.e., not all criti-
cal data is in scope of the batch-wise data review), further assessment of the audit trail may be
required. Score = 1.
• Severity Score is based upon the data criticality; it assesses the potential impact if data is modified
or deleted. There are three possible Severity Scores: 10, 4, or 1.
If the data is considered Highly critical or has a direct impact on patient safety or product qual-
ity (e.g., changing the temperature to make it appear it was within the validated range, despite
being outside the range), the potential impact is of high severity. Score = 10.
If the data has an indirect impact on patient safety or product quality, i.e., changes to the data result in a compliance risk only (e.g., changes to the time zone where the raw data and batch-related data are not affected), it is of medium potential severity. Score = 4.
If the data has negligible or no impact on patient safety or product quality (e.g., the start time of
an operation, provided that this is not a CPP and does not affect duration), it is of low potential
severity. Score = 1.
• Probability Score is based on the capability of the data to be modified or deleted by system users and administrators. There are three possible Probability Scores: 10, 2, or 1.
If the data can be modified or deleted by system users, the chance that such changes will occur is highest. Score = 10.
If the data cannot be modified or deleted by a user but can be modified or deleted by an independent administrator (i.e., an individual with no direct interest in the data), the chance that such changes will occur is lower. Score = 2.
If the data cannot be modified or deleted by any user or administrator, it is considered a very low probability. Score = 1.
Next, the three scores (Detectability, Severity, and Probability) are multiplied to provide the ATRA
final score. The lower the score, the lower the risk.
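As an illustration only, the calculation can be expressed in a few lines of Python; the dictionary keys below paraphrase the criteria above and are not terminology from this report.

```python
# Sketch of the ATRA calculation: the three factor scores are multiplied to
# give the final score, and a lower score means a lower risk.
DETECTABILITY = {"reviewed elsewhere": 0, "not or only partially reviewed": 1}
SEVERITY = {"direct impact": 10, "indirect impact": 4, "negligible impact": 1}
PROBABILITY = {"modifiable by users": 10, "independent admin only": 2, "not modifiable": 1}

def atra_final_score(detectability: int, severity: int, probability: int) -> int:
    """Multiply the Detectability, Severity, and Probability scores."""
    return detectability * severity * probability

# Example: data not otherwise reviewed (1), with a direct product-quality
# impact (10), that only an independent administrator can modify (2) scores
# 20; the review frequency is then read from Table 5.3.5.1-1.
score = atra_final_score(
    DETECTABILITY["not or only partially reviewed"],
    SEVERITY["direct impact"],
    PROBABILITY["independent admin only"],
)  # -> 20
```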
Table 5.3.5.1-1 illustrates how the final score can be used to determine the needed frequency of
review. Some examples are provided in Section 5.3.5.2 that highlight how this can be used together
with different elements of the audit trail to determine the full audit trail review requirements for a
system.
Table 5.3.5.1-1 ATRA Final Score and Frequency Review
The timeframes indicated are illustrative only. Organizations should assess their systems and apply
critical thinking when considering the criticality of the activity being performed as well as the poten-
tial for impact on product quality and patient safety. Using a risk-based approach, the periodicity of
reviews should be commensurate with the risks identified.
If during periodic review a number of data integrity breaches are discovered, the controls that are
(or should be) in place and/or the frequency of the review should be reevaluated to prevent future
breaches or to detect breaches sooner.
Elements for the batch setup data and batch run data parts of the audit trail, where data controls
may be vulnerable and may have an impact on critical batch data, might include:
• Were any of the recipe parameters modified outside the validated range during the execution of
the batch?
• Were changes made to CPPs (set points or allowed tolerance ranges) during manufacturing?
Were these within the validated ranges?
• Were the calculation parameters edited or modified?
• Were changes to the time made that impact the duration of (time-sensitive) parts of manufactur-
ing, i.e., an operation ran longer or shorter than was expected?
Similarly, for the system configuration data, a number of elements may be vulnerable and should be
considered when using the ATRA tool, such as:
• Were changes made to the system date or time?
• Was the report template or any batch report modified, edited, deleted, or renamed?
• What changes have been made to user access accounts or the privileges assigned at the user level?
Do these changes still ensure segregation of duties?
• Were any edits, modifications, or deletions made to the system configuration (e.g., administra-
tion changes, database settings, change of data storage location, system settings or functionality)?
• Were there changes, edits, or deletions of audit trails, history logs, or files? Were any renamed?
• Was any unusual activity recorded in the user activities log (login, logoff, unauthorized logins)?
Systems with high technical capabilities that are being fully utilized (e.g., strong access control, segre-
gation of duties) are less vulnerable, therefore, a significantly reduced frequency of audit trail review
is acceptable. On the other hand, if the system has limited technical capabilities (e.g., no difference
between administrator and operator privileges), the system is more vulnerable, and a more extensive
review of the audit trail would be appropriate.
Table 5.3.5.2-1 contains a few examples of how the ATRA tool can be used for a filter integrity
tester. The example does not represent a full ATRA review for a filter integrity tester but covers a few
audit trail elements for demonstrative purposes to clarify the use of an ATRA.
The scores in these examples are based on the definitions in Section 5.3.5.1. A brief explanation is
added to justify the scoring. The scores for detectability, severity, and probability are multiplied, giv-
ing a Final Score. Then, the data in Table 5.3.5.1-1 is used to determine the frequency of audit trail
review for that element (Outcome).
Completing the ATRA for the other elements of the audit trail will allow the user to more fully un-
derstand the parts of the audit trail that require review and the necessary frequency of that review.
The first scenario relates to the frequency of performing the backup of the manufacturing data stored
electronically. Companies should consider developing a principle-based backup strategy in order to
differentiate the frequency of backups. This strategy should take into account the company’s risk
tolerance, i.e., how manufacturing activities can be recreated in the event of a system failure when
data has not been backed up. If the only evidence of an activity that impacts the safety and/or quality
of the product is through the lost electronic data, the company must be prepared to reject the batch,
and a more frequent backup may be warranted. On the other hand, if the major activities can be
recreated using other means (e.g., printouts, manually recorded information), that may be used to
justify a less-frequent backup of electronic data.
In addition, the volume of data being generated within a given time period and the capability of
the electronic system to store that volume of data without overwriting existing data should also be
considered. This situation will be further discussed in the second scenario.
If the equipment is capable of automated backups, backups are recommended at the same frequency
at which data is generated. If new data is being generated daily (e.g., using an MES or historian), a
daily backup is warranted.
The second scenario is associated with looped memory. For the purposes of this technical report,
looped memory is used in an electronic system with a limited storage capacity. Once the capacity
limit is reached, new data automatically overwrites older data. For example, in a compression ma-
chine capable of storing 10 batches worth of data, the data for Batch 11 automatically overwrites the
data for Batch 1.
Table 5.3.6-1 describes controls for systems with a looped memory. For systems storing Highly
critical data, like certain makes or models of filter integrity testers used in sterile filling operations,
the backup should occur before the data is overwritten. A similar expectation should be in place for
systems storing Medium-critical data, such as a pH meter used on the shop floor. For systems storing
Low-critical data, like secondary packaging systems that do not store data relating to quality attri-
butes or such parameters as expiry dates, the backup should ensure that the data will not be overwrit-
ten before batch disposition has occurred.
Table 5.3.6-1 Data Integrity Control Grid for Backup of Electronic Data
• Backup of electronic data (prevention control). High criticality: Backup occurs before the data is overwritten. Medium criticality: Backup occurs before the data is overwritten. Low criticality: Backup ensures the data is not overwritten before batch disposition.
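The looped-memory constraint can be made concrete with a small calculation. The sketch below is illustrative Python under the assumption of a device that stores a fixed number of batches, as in the compression-machine example above; the function name is an assumption of the sketch.

```python
# With capacity for `capacity` batches, the data for batch k is overwritten
# when batch k + capacity starts (Batch 11 overwrites Batch 1 when the
# capacity is 10), so it must be backed up before then.
def batches_before_overwrite(batch: int, current_batch: int, capacity: int) -> int:
    """Batches that can still run before `batch`'s data is overwritten."""
    return max(0, (batch + capacity - 1) - current_batch)

# With a 10-batch memory, after completing batch 5 the data of batch 1
# survives for 5 more batches, so it must be backed up within that window:
assert batches_before_overwrite(batch=1, current_batch=5, capacity=10) == 5
assert batches_before_overwrite(batch=1, current_batch=10, capacity=10) == 0
```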
For reports and records related to Medium-critical data, such as a building management system
monitoring temperature and humidity, IT or automation personnel should perform an initial valida-
tion. Any changes to the report template should be performed following SOPs, but revalidation of
the template would not be required. The report may contain limited flexibility (via filters or drop-downs), provided it is verified to work as intended; the user may then run these reports based on a pre-defined set of filters or criteria.
For reports and records displaying Low-critical data, verification governed by work instructions
should suffice. The business unit creates these reports and records, and their content may be flexible
in nature, for example, exporting data into a data lake for future non-GMP analysis.
Regardless of the criticality, the quality unit should review and approve changes to report templates,
especially for any changes that could impact the information displayed.
Table 5.3.7.1-1 Data Integrity Control Grid for Data Export for Generation of Reports and Records
For High- or Medium-critical data transfers and migrations, the transfer of the electronic data should
be prepared using full change control, with a test plan in place to ensure that all data is transferred.
It also is advisable to use a verification tool, such as a checksum, to confirm that all data sent was
received. Two examples of this type of transfer are presented in this section.
One example is the migration of data from one platform to another as part of a software or hardware upgrade, for example, when moving from one software version to another, from an old server to a new server, or to the cloud. This is generally considered a one-off activity specifically associated with the upgrade.
A second example is the transfer of High- and Medium-critical batch data (e.g., CPPs, calibration
records, environmental monitoring data) from a standalone system to an upper-level system like an
MES or data historian. This type of data transfer occurs on a defined periodic basis (e.g., once a day,
once a week, after each batch) based on how the data transfer has been set up. For both examples,
the transfer or migration of the data should be safeguarded by change control, a test plan, and a tool
to verify that all the data have been transferred or migrated.
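Where file-based data is transferred, a cryptographic checksum comparison is one way to implement the verification described above. The following is a minimal, illustrative Python sketch; the file-based workflow, function names, and directory layout are assumptions, not part of this report.

```python
# Hash each source file and verify that an identical file exists at the
# destination; hashlib and pathlib are Python standard-library modules.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """SHA-256 digest of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_migration(source_dir: Path, dest_dir: Path) -> bool:
    """True only if every source file exists at the destination with an
    identical SHA-256 digest (i.e., all data sent was received)."""
    for src in source_dir.rglob("*"):
        if src.is_file():
            dst = dest_dir / src.relative_to(source_dir)
            if not dst.is_file() or sha256_of(src) != sha256_of(dst):
                return False
    return True
```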
Data of varying criticalities are regularly transferred into different formats for making continuous
process verification conclusions, for example, transferring data into a data lake for further trending or
analysis. These are considered Low-critical transfers, so a procedural control is adequate for describ-
ing the steps to be taken to transfer the data.
Table 5.3.7.2-1 Data Integrity Control Grid for Data Transfer and Migration Between Electronic Systems
• Data transfer and migration (prevention control). High criticality: Full change control, with a test plan and a verification tool (e.g., checksum) to confirm that all data was transferred. Medium criticality: Full change control, with a test plan and a verification tool (e.g., checksum) to confirm that all data was transferred. Low criticality: Procedural control describing the steps to be taken to transfer the data.
6.0 Controls for Big Data as it Relates to Data Integrity
A forthcoming PDA technical report will provide additional guidance on the governance and controls of big data in manufacturing, addressing topics such as data classification, data ownership, access management, source data, results and data insights, and change management.
Collectively, these two technical reports will provide guidance on the management of data and big
data in a regulated pharmaceutical or biopharmaceutical facility.
8.0 Appendix I: Examples — How to Use the 9-Box Vulnerability Grid
For each unit operation of the manufacturing processes, the user must assess the criticality of the
data and the existing data integrity controls in place. The criticality of the data will be either High,
Medium, or Low as defined in Section 4.2.1 Classification of Data Criticality. Table 4.2.1-1 con-
tains definitions of the classes of data criticality. The criticality of the data determines the row of the
9-Box vulnerability grid in which the data will reside. As shown in Table 4.3-1, High-critical data is
in the top row, Medium-critical data in the middle row, and Low-critical data in the bottom row.
For the purposes of this technical report, the existing data integrity controls will be considered as
High, Medium, or Low based on the definitions in Table 4.2.2-1. The level of existing data in-
tegrity controls associated with the manufacturing process will determine in which column of the
9-Box vulnerability grid the data belongs. Manufacturing processes with a High level of data integ-
rity controls in the form of computerized controls reside in the first column (moving from left to
right); manufacturing processes with a Medium level of data integrity controls in the form of hybrid
systems reside in the second column; and manufacturing processes with a Low level of data integrity
controls in the form of manual records reside in the third column.
As the user moves down the grid from top to bottom, the criticality of the data decreases. As the user moves across the grid from left to right, the level of control decreases. The criticality of the data cannot be changed; decreasing the vulnerability of the data is therefore achieved by increasing the controls.
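The grid logic can be captured as a simple lookup. The sketch below is illustrative Python, with the cell values inferred from the worked examples later in this appendix; Table 4.3-1 remains the authoritative grid.

```python
# 9-Box vulnerability grid as a lookup table. Cell values are inferred from
# the appendix examples (e.g., High criticality with Low controls yields
# Significant vulnerability, as in Example 1a).
VULNERABILITY_GRID = {
    # (data criticality, level of existing controls): vulnerability
    ("high",   "high"):   "Acceptable",   # Example 1c
    ("high",   "medium"): "Moderate",     # Example 1b
    ("high",   "low"):    "Significant",  # Example 1a
    ("medium", "high"):   "Negligible",   # Examples 2c, 5c
    ("medium", "medium"): "Acceptable",   # Example 5b
    ("medium", "low"):    "Moderate",     # Examples 2a, 5a
    ("low",    "high"):   "Negligible",   # Example 3c
    ("low",    "medium"): "Negligible",   # Example 3b
    ("low",    "low"):    "Acceptable",   # Example 3a
}

def data_vulnerability(criticality: str, controls: str) -> str:
    """Look up the vulnerability level for a criticality/controls pair."""
    return VULNERABILITY_GRID[(criticality.lower(), controls.lower())]
```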
8.1 API Process Examples
Example 1 involves a reactor in an API manufacturing process (Table 8.1-1). The reaction needs to be precisely controlled to maximize the production of the API intermediate while controlling an unwanted and difficult-to-remove impurity that will impact a critical quality attribute (CQA) of the final API.
Example 1a illustrates High-criticality data with manual operation of the reactor, manual capture of data, and manual transcription of data by the operator into a paper manufacturing record. In addition, the data controls for this process are at a Low level, as they rely on second-person witnessing of the manufacturing operation, human oversight of the manufacturing process, and human review during the release process. The High criticality of the data combined with the Low level of controls yields a Significant data vulnerability risk (Red), as defined in Section 4.3.
Example 1b illustrates the same High-criticality data but with the reactor better controlled using a
validated programmable logic controller (PLC). The data integrity controls are increased with the
PLC printouts included with the batch processing record (BPR) for the review and release process,
rather than relying on the manual transcription of data from the human machine interface (HMI) of
the PLC into a paper batch production record. Therefore, Example 1b yields Moderate risk of data
vulnerability (Orange).
Example 1c illustrates the same High-criticality data but with the use of additional controls: a validated electronic batch record (EBR) with automatic data capture; the inability to erase any data;
the use of access controls, activated audit trails, and automated data archiving; and the ability to
trend the electronically stored production data. With these additional controls employed, the vulner-
ability of the High-criticality data has reached an Acceptable level (Green).
Table 8.1-1 Example 1: High Criticality – Reactor in an API Manufacturing Process

Example 1a
• Mfg. Operation: Manually controlled reactor and paper BPR
• Data Criticality: High: Impacts CQA & is a CPP that controls impurity formation
• Data Vulnerability Factors: Human Factor: Manual transfer of CPP data from production instrumentation & gauges into paper BPR. GMP Process: Reactor is manually controlled by operator. Data Mgmt: Paper BPR
• Data Controls in Place (Low): Human Factor: Supervisory review & quality unit review at release. GMP Process: Controlled with written procedures and training. Data Mgmt: Reliance on second-person witnessing of data entries & quality unit review at release
• Data Vulnerability Level: Significant

Example 1b
• Mfg. Operation: PLC-controlled reactor and paper BPR
• Data Criticality: High: Impacts CQA & is a CPP that controls impurity formation
• Data Vulnerability Factors: Human Factor: Manual transfer of CPP data from HMI into BPR. GMP Process: PLC-controlled reactor of new or complex process. Data Mgmt: Manual data entry into paper BPR from HMI; no audit trail or electronic analysis of data in PLC
• Data Controls in Place (Medium): Human Factor: Supervisory review & quality unit review at release. GMP Process: PLC validated with printout attached to paper BPR. Data Mgmt: Reliance on second-person witnessing of data entries; quality unit review of BPR & PLC printout at release
• Data Vulnerability Level: Moderate

Example 1c
• Mfg. Operation: PLC-controlled reactor & electronic BPR
• Data Criticality: High: Impacts CQA & is a CPP that controls impurity formation
• Data Vulnerability Factors: Human Factor: Electronic BPR mitigates human error to a residual level. GMP Process: PLC-controlled reactor of robust process. Data Mgmt: Data automatically captured in EBR & data mgmt lifecycle is controlled
• Data Controls in Place (High): Human Factor: EBR reviewed by quality unit at release. GMP Process: PLC validated & production data captured automatically. Data Mgmt: Access controls on PLC; electronic analysis of data; automatic data storage & audit trail
• Data Vulnerability Level: Acceptable
Example 2 covers the oven drying of an API (Table 8.1-2). The loss on drying (LOD) of the API is not a CQA and the drying operation is not a CPP, but the
LOD does impact the flow characteristics of the API, which affects the downstream processing of the API into the finished product. Therefore, the data regarding
the drying of the API is of Medium criticality and belongs in the middle row of the 9-Box vulnerability grid.
Example 2a describes Medium-criticality data with the manual operation of the drying oven, manual data capture, and manual transcription of data into a paper
manufacturing record. In addition, the existing data integrity controls are Low-level, with reliance on second-person witnessing of the manufacturing operation,
human oversight of the manufacturing process, and human review at release. The Medium criticality of the data plus the Low level of controls yields Moderate data
vulnerability (Orange), as described in Section 4.3.
Example 2c illustrates the same Medium-criticality data but with more than the required controls in place: a validated EBR with automatic data capture, the in-
ability to erase any data, activated audit trails, automated data archiving, and the ability to trend the electronically stored production data. These controls yield a
Negligible level of data vulnerability (Blue).
Table 8.1-2 Example 2: Medium Criticality – Drying of API (LOD not a CQA)
Example 3 covers the batch-to-batch cleaning of equipment within a manufacturing campaign (Table 8.1-3); the data is of Low criticality because carryover will not impact the quality of the final API for the validated campaign length.
In Example 3a, the cleaning is performed manually, then the Low-critical data is captured by the
operator who transcribes it into the paper manufacturing record. The reliance on second-person wit-
nessing, human oversight of the manufacturing process, and human review at release provide a Low
level of data integrity controls. The Low criticality of the data plus the Low level of controls yields
Acceptable data vulnerability (Green).
Example 3b illustrates the same Low-critical data, but the level of controls in place is more than required. The process is better controlled through a validated clean-in-place (CIP) process. The data integrity controls have increased with the PLC printouts from the CIP system, which are included with the BPR for the review and release process. The additional controls yield Negligible data vulnerability (Blue).
Example 3c illustrates the same Low-critical data but with the addition of a validated EBR with
automatic data capture, inability to erase any data, activated audit trails, automated data archiving,
and ability to trend the electronically stored production data. The vulnerability of the data remains
unchanged, and the vulnerability remains at a Negligible level (Blue). In this example, there are
significantly more controls in place than are required.
Table 8.1-3 Example 3: Low Criticality – Batch-to-Batch Cleaning of Equipment

Example 3a
• Mfg. Operation: Manual batch-to-batch cleaning of equipment within a campaign
• Data Criticality: Low: Batch-to-batch carryover will not impact the quality of the final API for validated campaign length
• Data Vulnerability Factors: Human Factor: Manual recording of cleaning into paper BPR. GMP Process: Equipment cleaning manually performed by operator. Data Mgmt: Paper BPR
• Data Controls in Place (Low): Human Factor: Supervisory review & quality unit review at release. GMP Process: Controlled with written procedures and training. Data Mgmt: Reliance on second-person witnessing of data entries & quality unit review at release
• Data Vulnerability Level: Acceptable

Example 3b
• Mfg. Operation: CIP batch-to-batch cleaning of equipment within a campaign
• Data Criticality: Low: Batch-to-batch carryover will not impact the quality of the final API for validated campaign length
• Data Vulnerability Factors: Human Factor: Manual transfer of cleaning data from HMI into BPR. GMP Process: PLC-controlled CIP cleaning process. Data Mgmt: Manual data entry into paper BPR from HMI; no audit trail or electronic analysis of data in PLC
• Data Controls in Place (Medium): Human Factor: Supervisory review & quality unit review at release. GMP Process: PLC validated with printout attached to paper BPR. Data Mgmt: Reliance on second-person witnessing of data entries; quality unit review of BPR & PLC printout at release
• Data Vulnerability Level: Negligible

Example 3c
• Mfg. Operation: PLC-controlled CIP batch-to-batch cleaning & electronic BPR
• Data Criticality: Low: Batch-to-batch carryover will not impact the quality of the final API for validated campaign length
• Data Vulnerability Factors: Human Factor: EBR mitigates human error to a residual level. GMP Process: PLC-controlled CIP cleaning process is robust. Data Mgmt: Data automatically captured in EBR & data mgmt lifecycle is controlled
• Data Controls in Place (High): Human Factor: EBR reviewed by quality unit at release. GMP Process: PLC validated & cleaning data captured automatically. Data Mgmt: Access controls on PLC, electronic analysis of data, automatic data storage & audit trail
• Data Vulnerability Level: Negligible
8.2 Finished Dosage Form Examples
In Example 4, humidity is a CPP that will impact CQA (Table 8.2-1). The drug substance is hygroscopic and will degrade with elevated levels of moisture. Therefore, the High-criticality data goes in the top row of the 9-Box vulnerability grid (Table 4.3-1).
Example 4a illustrates High-criticality data with the operator manually reading analogue humidity data from a chart recorder and transcribing the data to a paper manufacturing record. A Low level of data controls exists, relying on a second person to verify the data. The High criticality of the data
combined with the Low level of controls creates Significant data vulnerability (Red), as defined in
Section 4.3.
Example 4b illustrates the same High-criticality data. In this example, the data controls have in-
creased compared to Example 4a. The humidity data is now digitized and captured using a validated
building management system (BMS), using audit trail and access controls. Also, a printout of the
humidity data is attached to the manufacturing record. The additional controls yield Moderate data
vulnerability (Orange).
Example 4c illustrates the same High-criticality data with the additional controls of a validated
electronic manufacturing record with automated data capture, audit trail, and access controls. These
additional controls yield Acceptable vulnerability (Green).
Table 8.2-1 Example 4: High Criticality – Humidity Monitoring During Sieving of a Drug Substance

Example 4a
• Mfg. Operation: Humidity data is displayed and recorded on a paper chart, and a paper manufacturing record is used
• Data Criticality: High: Humidity is a CPP that impacts CQA; drug substance is hygroscopic and will degrade with elevated moisture levels
• Data Vulnerability Factors: Human Factor: Analogue humidity data is added from chart recording, and data is transcribed to paper record. GMP Process: Humidity data is captured manually. Data Mgmt: Periodic manual entry of data into BPR
• Data Controls in Place (Low): Human Factor: Supervisor review and quality unit review at release. GMP Process: Chart paper is attached to paper record. Data Mgmt: Second-person verification of data entry and quality unit review at release
• Data Vulnerability Level: Significant

Example 4b
• Mfg. Operation: Humidity data is displayed and recorded using BMS, and a paper manufacturing record is used
• Data Criticality: High: Humidity is a CPP that impacts CQA; drug substance is hygroscopic and will degrade with elevated moisture levels
• Data Vulnerability Factors: Human Factor: Digital humidity data is printed out from HMI, and data is transcribed to paper record. GMP Process: Humidity data is captured both manually and electronically. Data Mgmt: Periodic manual data entry into paper BPR
• Data Controls in Place (Medium): Human Factor: Supervisor review and quality unit review at release. GMP Process: HMI printout is attached to paper record; validated BMS with audit trail and access controls. Data Mgmt: Second-person verification of printout and quality unit review at release
• Data Vulnerability Level: Moderate

Example 4c
• Mfg. Operation: Humidity data is displayed and recorded using BMS, and an electronic manufacturing record is used
• Data Criticality: High: Humidity is a CPP that impacts CQA; drug substance is hygroscopic and will degrade with elevated moisture levels
• Data Vulnerability Factors: Human Factor: An electronic manufacturing record mitigates human error to a residual level. GMP Process: Humidity data is captured electronically. Data Mgmt: Humidity data is automatically uploaded into electronic manufacturing record
• Data Controls in Place (High): Human Factor: Use of electronic record minimizes manual entry of data. GMP Process: Validated BMS with audit trail and access controls; automated data capture in electronic record. Data Mgmt: Humidity data is managed per Part 11 requirements
• Data Vulnerability Level: Acceptable
Example 5a illustrates Medium-criticality data with the operator manually interpolating analogue humidity data from a chart recorder and transcribing the data to a paper manufacturing record (Table 8.2-2). In addition, a Low level of data controls exists, relying on a second person to verify the data. The Medium criticality of the data combined with a Low level of control creates Moderate data vulnerability (Orange).
Example 5b illustrates the same Medium-critical data, but the data controls have increased compared to Example 5a. The humidity data is now digitized and
captured using a validated BMS, with audit trail and access controls. Also, a printout of the humidity data is attached to the manufacturing record. The additional
controls yield Acceptable data vulnerability (Green).
Example 5c illustrates the same Medium-criticality data with the added controls of a validated electronic manufacturing record with automated data capture, audit
trail, and access controls. The additional controls yield Negligible vulnerability (Blue).
Table 8.2-2 Example 5: Medium Criticality – Humidity Monitoring During Sieving of a Drug Substance
Example 5a
• Mfg. Operation: Humidity data is displayed and recorded on a paper chart; a paper manufacturing record is used
• Data Criticality: Medium: Humidity is not a CPP; at elevated humidity levels, drug substance will become sticky, giving poor flow properties
• Data Vulnerability Factors: Human Factor: Analogue humidity data is interpolated from chart recording, and data is transcribed to paper record. GMP Process: Humidity data is captured manually. Data Mgmt: Periodic manual entry of data into BPR
• Data Controls in Place (Low): Human Factor: Supervisor review and quality unit review at release. GMP Process: Chart paper is attached to paper record. Data Mgmt: Second-person verification of data entry; quality unit review at release
• Data Vulnerability Level: Moderate

Example 5b
• Mfg. Operation: Humidity data is displayed and recorded using BMS; a paper manufacturing record is used
• Data Criticality: Medium: Humidity is not a CPP; at elevated humidity levels, drug substance will become sticky, giving poor flow properties
• Data Vulnerability Factors: Human Factor: Digital humidity data is printed out from HMI, and data is transcribed to paper record. GMP Process: Humidity data is captured both manually and electronically. Data Mgmt: Periodic manual data entry into paper BPR
• Data Controls in Place (Medium): Human Factor: Supervisor review and quality unit review at release. GMP Process: HMI printout is attached to paper record; validated BMS with audit trail and access controls. Data Mgmt: Second-person verification of printout and quality unit review at release
• Data Vulnerability Level: Acceptable

Example 5c
• Mfg. Operation: Humidity data is displayed and recorded using BMS; an electronic manufacturing record is used
• Data Criticality: Medium: Humidity is not a CPP; at elevated humidity levels, drug substance will become sticky, giving poor flow properties
• Data Vulnerability Factors: Human Factor: An electronic manufacturing record mitigates human error to a residual level. GMP Process: Humidity data is captured electronically. Data Mgmt: Humidity data is automatically uploaded into electronic manufacturing record
• Data Controls in Place (High): Human Factor: Use of electronic record minimizes manual entry of data. GMP Process: Validated BMS with audit trail and access controls; automated data capture in electronic record. Data Mgmt: Humidity data is managed per Part 11 requirements
• Data Vulnerability Level: Negligible
Example 6a illustrates Low-critical data with the operator manually adding analogue humidity data from a chart recorder and transcribing this data to a paper manufacturing record. In addition, a Low level of existing data controls exists, relying on a second person to verify the data. The Low criticality of the data combined with a Low level of controls creates an Acceptable data vulnerability risk (Green).
Example 6b illustrates the same Low-critical data; however, the data controls have increased compared to Example 6a. The humidity data is now digitized and
captured using a validated BMS, with audit trail and access controls. Also, a printout of the humidity data is attached to the manufacturing record. The additional
controls yield a Negligible data vulnerability risk (Blue).
Example 6c illustrates the same Low-critical data with the added controls of a validated electronic manufacturing record with automated data capture, audit trail, and
access controls. The vulnerability of the data remains unchanged at a Negligible level (Blue). In this example, significantly more controls are in place than are required.
Table 8.2-3 Example 6: Low Criticality – Humidity Monitoring During Sieving of a Drug Substance
Example 6a
• Mfg. Operation: Humidity data is displayed and recorded on a paper chart; a paper manufacturing record is used
• Data Criticality: Low: Humidity is not a CPP; drug substance is not hygroscopic and is very stable with good flow properties over a wide humidity range
• Data Vulnerability Factors: Human Factor: Analogue humidity data is interpolated from chart recording, and data is transcribed to paper record. GMP Process: Humidity data is captured manually. Data Mgmt: Periodic manual entry of data into BPR
• Data Controls in Place (Low): Human Factor: Supervisor review and quality unit review at release. GMP Process: Chart paper is attached to paper record. Data Mgmt: Second-person verification of data entry; quality unit review at release
• Data Vulnerability Level: Acceptable

Example 6b
• Mfg. Operation: Humidity data is displayed and recorded using BMS; a paper manufacturing record is used
• Data Criticality: Low: Humidity is not a CPP; drug substance is not hygroscopic and is very stable with good flow properties over a wide humidity range
• Data Vulnerability Factors: Human Factor: Digital humidity data is printed out from HMI, and data is transcribed to paper record. GMP Process: Humidity data is captured both manually and electronically. Data Mgmt: Periodic manual data entry into paper BPR
• Data Controls in Place (Medium): Human Factor: Supervisor review and quality unit review at release. GMP Process: HMI printout is attached to paper record; validated BMS with audit trail and access controls. Data Mgmt: Second-person verification of printout; quality unit review at release
• Data Vulnerability Level: Negligible

Example 6c
• Mfg. Operation: Humidity data is displayed and recorded using BMS; an electronic manufacturing record is used
• Data Criticality: Low: Humidity is not a CPP; drug substance is not hygroscopic and is very stable with good flow properties over a wide humidity range
• Data Vulnerability Factors: Human Factor: An electronic manufacturing record mitigates human error to a residual level. GMP Process: Humidity data is captured electronically. Data Mgmt: Humidity data is automatically uploaded into electronic manufacturing record
• Data Controls in Place (High): Human Factor: Use of electronic record minimizes manual entry of data. GMP Process: Validated BMS with audit trail and access controls; automated data capture in electronic record. Data Mgmt: Humidity data is managed per Part 11 requirements
• Data Vulnerability Level: Negligible
8.3 Sterility Assurance Examples
Examples 7–9 demonstrate data integrity risk levels for filter integrity testing with three potential scenarios.
In Example 7, the data is considered to be of High criticality because the filter integrity testing im-
pacts CQA (sterility) and it is a CPP that confirms the absence of viable microorganisms after aseptic
filling, which is a regulatory expectation (Table 8.3-1).
In Example 7a, the data vulnerability level is Significant, as defined in Section 4.3, due to the high
level of human intervention and because the legacy filter integrity tester has either no ability or a lim-
ited ability to store electronic data; a paper printout is used in lieu of electronic data (Red).
In Example 7b, the data vulnerability becomes Moderate when the potential for human errors is
reduced by using newer technology that features improved functionality to ensure automated data
capture, secure storage, and audit trail capabilities (Orange).
Example 7c shows that the data vulnerability is Acceptable when human intervention is reduced to
a minimum with the filter testing performed in-line, without the operator having to initiate the test
manually, and the test results are automatically transferred to a secure data storage module (Green).
Table 8.3-1 Example 7: High Criticality – Filter Integrity Tester

Example 7a
• Mfg. Operation: Aseptic Filling: Filter integrity tester with paper printout test results
• Data Criticality: High: Impacts CQA (sterility) and is a CPP that confirms absence of viable microorganisms after aseptic filling
• Data Vulnerability Factors: Human Factor: Manual transcription of set-up data by operator. GMP Process: Filter integrity tester is manually controlled by the operator. Data Mgmt: Legacy filter integrity tester used with no/limited ability to store electronic data; paper printout used in lieu of electronic data
• Data Controls in Place (Low): Human Factor: Supervisor reviews test results prior to batch release. GMP Process: Controlled with written procedures and training. Data Mgmt: Reliance on peer review of data entries and test results
• Data Vulnerability Level: Significant

Example 7b
• Mfg. Operation: Aseptic Filling: Filter integrity tester with automated data entry and storage and electronic test results
• Data Criticality: High: Impacts CQA (sterility) and is a CPP that confirms absence of viable microorganisms after aseptic filling
• Data Vulnerability Factors: Human Factor: Automated data entry and data back-up mitigates human error to a residual level. GMP Process: Filter integrity tester is manually controlled by the operator. Data Mgmt: Latest filter integrity tester is used; has ability to interface with bar code readers to capture set-up data, has secure data storage module, and can record operator's actions, including number of test runs, electronically and also on the paper printout
• Data Controls in Place (Medium): Human Factor: Supervisor checks test run alert data prior to batch release. GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly. Data Mgmt: Reliance on automation, validation, and robust monitoring process
• Data Vulnerability Level: Moderate

Example 7c
• Mfg. Operation: Aseptic Filling: Filter integrity tester with end-to-end automation
• Data Criticality: High: Impacts CQA (sterility) and is a CPP that confirms absence of viable microorganisms after aseptic filling
• Data Vulnerability Factors: Human Factor: End-to-end automation mitigates human error to a residual level. GMP Process: Filter integrity testing instruments integrated into the unit operation. Data Mgmt: Filter is tested in-line without operator having to initiate the test manually; test results are automatically transferred to a secure data storage module
• Data Controls in Place (High): Human Factor: Supervisory review is limited to only investigation of failed test. GMP Process: In-line test equipment and associated interfaces validated, and process robustness monitored regularly. Data Mgmt: Reliance on automation, validation, and the robust monitoring process
• Data Vulnerability Level: Acceptable
In Example 8a, the data vulnerability level is Moderate (Orange) due to the high level of human intervention and because the legacy filter integrity tester has either no or a limited ability to store electronic data; a paper printout is used in lieu of electronic data.
In Example 8b, the data vulnerability becomes Acceptable (Green) when human errors are reduced by using newer technology that features improved functionality
to ensure automated data capture, secure storage, and audit trail capabilities.
Example 8c shows that the data vulnerability becomes Negligible (Blue) when human intervention is reduced to a minimum with the filter testing performed in-
line, without the operator having to initiate the test manually, and test results are automatically transferred to a secure data storage module.
Table 8.3-2 Example 8: Medium Criticality – Filter Integrity Testing at the Point of Fill (Pre-Use–Post-Sterilization)
Example 8a
• Mfg. Operation: Aseptic Filling: Filter integrity tester with paper printout test results
• Data Criticality: Medium: Verifies filter integrity after sterilizing filtration but before aseptic filling
• Data Vulnerability Factors: Human Factor: Manual transcription of set-up data by the operator. GMP Process: Filter integrity tester is manually controlled by the operator. Data Mgmt: Legacy filter integrity tester used with no/limited ability to store electronic data; paper printout is used in lieu of electronic data
• Data Controls in Place (Medium): Human Factor: Supervisor reviews test results prior to initiating aseptic filling process. GMP Process: Controlled with written procedures and training. Data Mgmt: Reliance on peer review of data entries and test results
• Data Vulnerability Level: Moderate

Example 8b
• Mfg. Operation: Aseptic Filling: Filter integrity tester with automated data entry and storage and electronic test results
• Data Criticality: Medium: Verifies filter integrity after sterilizing filtration but before aseptic filling
• Data Vulnerability Factors: Human Factor: Automated data entry and data back-up mitigates human error to a residual level. GMP Process: Filter integrity tester is manually controlled by the operator. Data Mgmt: Latest filter integrity tester is used that has the ability to interface with bar code readers to capture set-up data, a secure data storage module, and the ability to record operator's actions, including number of test runs, electronically and also on the paper printout
• Data Controls in Place (High): Human Factor: Supervisor checks test run alert data prior to aseptic filling process. GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly. Data Mgmt: Reliance on automation, validation, and robustness monitoring process
• Data Vulnerability Level: Acceptable

Example 8c
• Mfg. Operation: Aseptic Filling: Filter integrity tester with end-to-end automation
• Data Criticality: Medium: Verifies filter integrity after sterilizing filtration but before aseptic filling
• Data Vulnerability Factors: Human Factor: End-to-end automation mitigates human error to a residual level. GMP Process: Process robustness assured by automation, validation, and monitoring practices; redundant sterilizing filters installed. Data Mgmt: Filter is tested in line without the operator having to initiate the test manually, and test results are automatically transferred to a secure data storage module
• Data Controls in Place (High): Human Factor: Supervisory review is limited to only investigation of failed test runs. GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly. Data Mgmt: Reliance on automation, validation, and robustness of monitoring process
• Data Vulnerability Level: Negligible
In Example 9a, the data vulnerability level is Acceptable (Green) due to Low data criticality and the fact that high levels of human intervention with legacy filter
integrity tester are adequately managed through procedural controls.
In Example 9b, the data vulnerability becomes Negligible (Blue) when human errors are reduced by using newer technology that features improved functionality to ensure automated data capture, secure storage, and audit trail capabilities.
Example 9c shows that the data vulnerability is reduced significantly and remains Negligible (Blue) when human intervention is reduced to a minimum with the
filter testing performed in-line, without the operator having to initiate the test manually, and the test results are automatically transferred to a secure data storage
module. While this example is theoretically possible to put into practice, it is operationally unlikely at the current time.
Table 8.3-3 Example 9: Low Criticality – Filter Identity Verification at the Point of Fill (Pre-Use Pre-Sterilization)
Example 9a
• Mfg. Operation: Aseptic Filling: Filter integrity tester with paper printout test results
• Data Criticality: Low: Verifies filter identity to support correct filter installation
• Data Vulnerability Factors: Human Factor: Manual transcription of set-up data by the operator. GMP Process: Filter integrity tester is manually controlled by the operator. Data Mgmt: Legacy filter integrity tester used with no/limited ability to store electronic data; paper printout is used in lieu of electronic data
• Data Controls in Place (High): Human Factor: Supervisory review is limited to only investigation of failed tests. GMP Process: Controlled with written procedures and training. Data Mgmt: Reliance on peer review of test results
• Data Vulnerability Level: Acceptable

Example 9b
• Mfg. Operation: Aseptic Filling: Filter integrity tester with automated data entry and storage and electronic test results
• Data Criticality: Low: Verifies filter identity to support correct filter installation
• Data Vulnerability Factors: Human Factor: Automated data entry and data back-up mitigates human error to a residual level. GMP Process: Filter integrity tester is manually controlled by the operator. Data Mgmt: Latest filter integrity tester is used that has the ability to interface with bar code readers to capture set-up data, a secure data storage module, and the ability to record operator's actions, including number of test runs, electronically and on the paper printout
• Data Controls in Place (High): Human Factor: Supervisory review is limited to only investigation of failed tests. GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly. Data Mgmt: Reliance on automation, validation, and the robustness of monitoring process
• Data Vulnerability Level: Negligible

Example 9c
• Mfg. Operation: Aseptic Filling: Filter integrity tester with end-to-end automation
• Data Criticality: Low: Verifies filter identity to support correct filter installation
• Data Vulnerability Factors: Human Factor: End-to-end automation mitigates human error to a residual level. GMP Process: Process robustness assured by automation, validation, and monitoring practices; redundant sterilizing filters installed. Data Mgmt: Filter is tested in line without operator having to initiate the test manually, and test results are automatically transferred to a secure data storage module
• Data Controls in Place (High): Human Factor: Supervisory review is limited to only investigation of failed tests. GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly. Data Mgmt: Reliance on automation, validation, and the robustness of monitoring process
• Data Vulnerability Level: Negligible
8.4 Tablet Packaging Process Examples
As shown in Table 8.4-1, this example leads to the conclusion that the controls in place are adequate for planning, line clearance, and in-process control (IPC) testing, while Significant or Moderate vulnerabilities exist in dispensing, equipment set-up, packaging operation, and batch reconciliation.
Where the vulnerability is Significant or Moderate, steps should be taken to mitigate the vulnerabilities. The dispensing process, for example, could be improved by avoiding the use of spreadsheets, using a barcode/scanning/enterprise resource planning interface to confirm the correct identity of materials, or adding a real-time double-check that the correct quantity of the materials has been dispensed. Both the equipment set-up and packaging operation could be improved by increasing the access controls, so each user has a unique account, or by adding minimum–maximum product-specific limits to the tolerance ranges for sensors that cannot be modified by the operator.
For the batch reconciliation, using a barcode/scanning/validated electronic system interface could
replace the manual logbook for identifying when a new component is required.
Ex Mfg. Operation Data Criticality Data Vulnerability Factors Data Controls in Place Data Vulnerability Level
Planning: Creation of the process Medium: Ensures unique batch Human Factor: Procedure allows, if High:
order in a validated electronic number assigned, and correct the validated electronic system is Human Factor: Documentation team prints
system (such as SAP). Once master templates are used inaccessible, to print the master template the batch record only after the process
created, the batch-specific data and add a batch number manually, order is created, allowing automatic
in the validated electronic system potentially adding an incorrect batch transfer of batch number
(batch number, maternal number, number GMP Process: The electronic system
10a etc.) is automatically added to the Negligible
GMP Process: Process robustness and associated automatic transfer of
master batch record template and assured by validation batch number to an approved master
printed for use in production manufacturing template is validated
Data Mgmt: Limited risks as process and
transfer of batch specific information to Data Mgmt: Reliance on validation and
the master batch record is controlled by a user access to the validated electronic
validated system system and documentation control system.
Ex 10b
Mfg. Operation: Dispensing. A spreadsheet is printed from the validated electronic system process order detailing what to dispense and in what quantity; materials are prepared and dispensed, and quantities are manually recorded directly into a master batch record.
Data Criticality: High. Need to ensure correct materials and batch numbers are selected and used; potential for product mix-up or packaging material mix-up, which can impact product quality or patient safety.
Data Vulnerability Factors:
- Human Factor: A single operator records, by manual entry into the master manufacturing record, the actual quantities of dispensed materials.
- GMP Process: Dispensing is a completely manual process; barcodes or labels are not used; no second format to confirm the quantity dispensed (e.g., printout from weigh scales or scales connected to an upper-level system recording quantities dispensed).
- Data Mgmt: Data in the spreadsheet can be easily modified before printing, creating errors during the dispensing process.
Data Controls in Place: Low.
- Human Factor: Supervisor review only at the end of packaging.
- GMP Process: Controlled with written procedures and training.
- Data Mgmt: Reliance on converting data in the validated electronic system to a spreadsheet, which is printed and given to the dispensing team.
Data Vulnerability Level: Significant
Ex 10c
Mfg. Operation: Line Clearance. Performed before dispensed materials are brought to the packaging line; documented directly into the master manufacturing template; sensors installed in hard-to-clean parts of the equipment.
Data Criticality: High. If not performed robustly, packaging components or finished product from the previous run may be mixed into the new packaging order.
Data Vulnerability Factors:
- Human Factor: Reliance on manual confirmation that line clearance was performed correctly; chance of both operators missing the difficult-to-clean parts of the equipment, but sensors installed in these areas reduce the risk.
- GMP Process: Sensors in place only in parts of the line; reliance on human confirmation of line clearance in other parts of the line.
- Data Mgmt: Limited risks, as the process is controlled by validated systems; user access is well controlled so only a limited number of trained personnel have access to change settings.
Data Controls in Place: High.
- Human Factor: Real-time four-eyes check performed by a certified peer; quality unit review at release.
- GMP Process: Controlled with written procedures, training of the first operator, and certification of the peer performing the four-eyes review.
- Data Mgmt: Reliance on validation and user access to the validated electronic system, documentation control system, and sensors.
Data Vulnerability Level: Acceptable
Ex 10d
Mfg. Operation: Equipment Set-Up. Recipe (for CPPs)/tolerance ranges (balances, cameras, sensors) and other adjustable settings are configured; packaging equipment uses group accounts for operators, who have access to choose any recipe and edit set-up parameters.
Data Criticality: High. Incorrect settings could potentially result in impact to product quality or patient safety.
Data Vulnerability Factors:
- Human Factor: Operator could select an incorrect recipe, impacting CPPs, or change tolerance ranges (balances, cameras, sensors) outside of the required range to ensure a higher pass rate (fewer rejects); possible to change CPPs outside of the validated range.
- GMP Process: Set-up is manually controlled by the operator; no double-check before starting packaging.
- Data Mgmt: Operators use group accounts to make the adjustments and are able to change set-up parameters.
Data Controls in Place: Low.
- Human Factor: Peer and quality unit review of the batch report to confirm the correct recipes and tolerance ranges were used.
- GMP Process: Controlled with written procedures and training.
- Data Mgmt: Different user groups for operator, maintenance, and administrator; batch report automatically records the set-up parameters/recipe used.
Data Vulnerability Level: Significant
Ex 10e
Mfg. Operation: Packaging. Bulk tablets are packed in blisters and added to the carton together with the patient information leaflet, before being bundled.
Data Criticality: Medium. Once the set-up is completed correctly, the potential to impact product quality is reduced.
Data Vulnerability Factors:
- Human Factor: Operator can add rejects back into the packaging process or adjust the tolerance ranges during processing to reduce the rejects.
- GMP Process: Operator controls the packaging process manually and can edit the CPP setpoints and sensor tolerance ranges; no double-check during processing of any edits made.
- Data Mgmt: Operators use group accounts to make setting adjustments.
Data Controls in Place: Low.
- Human Factor: Peer and quality unit review of the batch report to confirm the correct recipes and tolerance ranges were used.
- GMP Process: Controlled with written procedures and training.
- Data Mgmt: Different user groups (operator, maintenance, and administrator); audit trail automatically records all changes to setpoints or tolerance ranges and includes them in the batch report.
Data Vulnerability Level: Moderate
Ex 10f
Mfg. Operation: In-Process Control (IPC) Testing. IPC testing is performed by trained personnel; the periodicity of IPC testing is part of the batch record, with space to record all IPC results and attach all equipment printouts.
Data Criticality: Medium. IPC tests are used to make adjustments during the packaging process, but all parameters are also part of release testing.
Data Vulnerability Factors:
- Human Factor: Recording can be done without performing the activity, but a missing equipment printout will show testing never took place.
- GMP Process: Test results are transcribed into the batch record, and all printouts are manually added to the batch record; possible transcription errors.
- Data Mgmt: Testing equipment allows reprinting of previous tests; the printout does not have the batch-specific information or the user's name; only date, time, and result are captured; the equipment does not store the data electronically.
Data Controls in Place: Medium.
- Human Factor: Double verification by a peer for each new IPC test; quality unit review that a new test was performed according to the testing periodicity defined in the batch record.
- GMP Process: Controlled with written procedures and training; the operator adds the equipment printout to the batch record, overlaying their signature.
- Data Mgmt: Testing equipment generates a printout with the time and date of the test performed; operators do not have access to change the time or date on the equipment (reserved for administrators).
Data Vulnerability Level: Acceptable
Ex 10g
Mfg. Operation: Batch Reconciliation. Batch yield is calculated, along with reconciliation of unused packaging materials.
Data Criticality: High. If not performed correctly, potential to impact patient safety.
Data Vulnerability Factors:
- Human Factor: Operators can stop the batch early, or add rejected materials back into the line, impacting the reconciliation process.
- GMP Process: Batch yield is calculated using counters (pass/reject information) that are part of the batch report; reconciliation of packaging components is a completely manual process.
- Data Mgmt: Operators use logbooks to record the use of packaging components during the batch; possible to miscalculate the quantity of packaging materials used.
Data Controls in Place: Medium.
- Human Factor: Double verification by a peer for each packaging component used, and quality unit review of the reconciliation process.
- GMP Process: Controlled with written procedures and training; operators need to complete a logbook each time a new packaging component is required, making a use history of packaging components available.
- Data Mgmt: Equipment automatically generates a report at the end of packaging; the report template and data are locked so no one can edit the contents.
Data Vulnerability Level: Moderate
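The Significant and Moderate levels in examples 10d, 10e, and 10g trace back largely to group accounts and freely editable set-up parameters. One way to picture the combined mitigations discussed above (unique named accounts, locked product-specific limits, and an automatic audit trail) is the hypothetical Python sketch below; the parameter name and limits are invented for illustration.

    from datetime import datetime, timezone

    # Product-specific validated limits, locked against operator modification
    # (hypothetical parameter and values)
    VALIDATED_LIMITS = {"seal_temp_C": (165.0, 185.0)}

    audit_trail: list[dict] = []

    def set_parameter(user_id: str, name: str, value: float) -> None:
        """Reject values outside the validated range; log accepted changes."""
        low, high = VALIDATED_LIMITS[name]
        if not low <= value <= high:
            raise ValueError(f"{name}={value} outside validated range {low}-{high}")
        audit_trail.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "user": user_id,  # unique, named account rather than a group login
            "parameter": name,
            "value": value,
        })

    set_parameter("j.smith", "seal_temp_C", 172.0)   # accepted and logged
    # set_parameter("j.smith", "seal_temp_C", 200.0) # would raise ValueError

With limits enforced at entry and every accepted change attributed to a named account in the audit trail, the batch report review described in examples 10d and 10e becomes a confirmation step rather than the only line of defense.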
2. European Commission. Annex 1: Manufacture of Sterile Medicinal Products, EudraLex – Volume 4 – Good Manufacturing Practice for Medicinal Products for
Human and Veterinary Use Consultation Document. European Commission: Brussels, 2017.