Technical Report No. 84
Integrating Data Integrity Requirements into Manufacturing & Packaging Operations
Integrating Data Integrity Requirements into Manufacturing & Packaging Operations
Authors
Ronald Tetzlaff, PhD, Parexel Consulting
Anthony Warchut, AC Warchut GMP Consulting

Contributors
Jeffrey Broadfoot, Emergent BioSolutions, Inc.
Derek Glover, Mylan
Maryann Gribbin, Faith & Royale Consultants
Tina Morris, PhD, American Association of Pharmaceutical Scientists
Rebecca Parrilla, U.S. Food and Drug Administration
Carmelo Rosa, PsyD, U.S. Food and Drug Administration
Anil Sawant, PhD, Merck
Christopher Smalley, PhD, ValSource, LLC
ISBN: 978-1-945584-19-0
3.1 Historical Perspective
3.2 Regulatory Guidance
3.3 Industry Best Practices
4.0 QUALITY RISK MANAGEMENT APPLIED TO DATA INTEGRITY
4.1 Considerations in Assessing Risk
4.2 Data Integrity Risk Management Model
4.3 Data Vulnerability (9-Box)
4.4 Using the Data Vulnerability Grid
… TO DATA INTEGRITY
7.0 REFERENCES
8.0 APPENDIX I: EXAMPLES HOW TO USE THE 9-BOX VULNERABILITY GRID
8.1 API Process Examples
8.2 Finished Dosage Form Examples
8.3 Sterility Assurance Examples
8.4 Tablet Packaging Process Examples
8.5 References for Appendix I
FIGURES AND TABLES INDEX

Figure 3.1.1-1 FDA Warning Letters–DI Issues Related to Production Activities by Country of Inspected Location
Table 3.1.3-1 Examples of Data Integrity Issues in Production Operations cited by FDA and EU
Figure 4.2-1 Data Integrity Risk Management Model
Table 4.2.1-1 Classification of Data Criticality
Table 4.2.2-1 Data Control Levels
Table 4.3-1 Data Vulnerability Grid (9-Box)
Figure 5.2-1 Methodology Used to Determine Differentiated Controls
Table 5.3.2-1 Data Integrity Control Grid for Issuance and Reconciliation of …
… Recording without a Controlled Second Format
Table 5.3.3.2-1 Data Integrity Control Grid for Data Accuracy when Transcribing Manually …
Table 8.1-1 Example 1: High Criticality – API Reaction Controls Impurity Level
Table 8.1-2 Example 2: Medium Criticality – Drying of API (LOD not a CQA)
Table 8.1-3 Example 3: Low Criticality – Small Molecule API Batch-to-Batch Cleaning of Equipment within a Campaign
Table 8.2-1 Example 4: High Criticality – Humidity Monitoring During Sieving of a Drug Substance
Table 8.2-2 Example 5: Medium Criticality – Humidity Monitoring During Sieving of a Drug Substance
… Filter Integrity Testing at the Point of Fill (Pre-Use–Post-Sterilization)
Table 8.3-3 Example 9: Low Criticality – Filter Identity Verification at the Point of Fill (Pre-Use Pre-Sterilization)
Figure 8.4-1 Overview of the Packaging Process
Table 8.4-1 Example 10: High & Medium Criticality – Tablet Packaging
1.0 Introduction and Scope
Data integrity is the cornerstone of establishing and maintaining confidence in the reliability of the data that assures product quality and patient safety. The reliability of manufacturing production and control data depends on the procedures, systems, processes, and controls in place to ensure data integrity. In production, these controls start at the point of data creation and continue through the entire data lifecycle, including the storage of data and the ability to retrieve it later to support the quality of the products manufactured.
The term “data integrity” has evolved to denote the degree to which data are complete, consistent, enduring, and available (ALCOA+) as well as the degree to which the characteristics of the data are maintained for all operations throughout the data lifecycle. Pharmaceutical manufacturers need to ensure that relevant data are available to document and provide traceability to what occurred, that the data are attributable to the person who performed the activity and entered the data, and that the data cannot be deleted, omitted, or in any way modified to misrepresent what occurred. All breaches of data integrity, whether intentional or unintentional, must be investigated.
Advances in information technology and in-process monitoring sensors, coupled with lower costs of computer memory and better processors, have resulted in explosive growth in electronic data generation, acquisition, and storage. Applying a one-size-fits-all approach to data integrity controls established on principles developed decades ago may be neither valuable nor, in certain instances, feasible. Therefore, the use of a risk-based approach is essential to developing a robust data integrity program, considering the requirements from a data-lifecycle perspective. Risk-based concepts apply even as industry transitions from manual documentation systems to fully electronic or hybrid (manual and electronic) systems.

This technical report addresses data integrity from the perspective of manufacturing operations. It discusses regulatory trends, risk management concepts, and recommendations for implementing appropriate data integrity controls in manufacturing operations applicable to paper-based, electronic-based, and hybrid systems. The case studies included in this technical report provide examples of how to assess current data integrity risks and implement the concepts presented here.
1.1 Purpose
…economical data storage, and superiority of electronic audit trail capabilities over paper records have compelled the pharmaceutical industry to rethink good manufacturing practices (GMP) controls. Among the consequences is a heightened awareness of the need to establish and maintain effective data integrity controls at every stage of the manufacturing process for drug products. While the requirement to maintain accurate and complete data is well recognized by industry, what is not universally understood is the level of data integrity control needed at each step to comply with GMP regulations. Many companies continue to struggle when deciding what controls are appropriate for each manufacturing operation and what levels of review and verification are necessary to ensure reliable manufacturing control data. This technical report describes an approach using quality risk management (QRM) for establishing and assessing the appropriateness of data integrity controls for each manufacturing operation based on the criticality and vulnerability of the data for its intended use.

Developed by subject matter experts from global industry and regulatory agencies, this technical report summarizes manufacturing data integrity risks and identifies best practices that can be used to develop and sustain robust documentation as well as data integrity management procedures, systems, processes, and controls. Employing these practices will help users achieve compliance with applicable laws, regulations, and directives for pharmaceutical products such as active pharmaceutical ingredients (APIs), solid oral dosage forms, sterile injectables, biologics, and vaccines.

Assuring data integrity is an organizational responsibility. Employees at any facility that manufactures, processes, packages, or holds a finished pharmaceutical, intermediate ingredient, or API are responsible for ensuring that data collected throughout the manufacturing process are accurate and reliable. Managers, quality assurance personnel, operators, technicians, and support staff alike must understand the importance of following established procedures and documenting their actions and decisions.
1.2 Scope
The information in this technical report applies to the management of data at pharmaceutical facilities that manufacture, process, package, or hold a finished pharmaceutical, API, or intermediate. Specifically, it addresses data pertaining to manufacturing operations, materials, facilities and equipment, production, and packaging and labeling, including in-process controls and process analytical testing. It also applies to the procedures, systems, processes, and controls used at a drug facility to ensure that the drugs conform to applicable laws, regulations, and directives and that the data support the drug’s identity, strength, quality, and purity.

The methods and processes described here can be applied to facilities that manufacture drugs intended for use both in clinical trials and for commercial distribution. The principles may be extended to facilities engaged in other activities (such as distribution of finished products to customers) or products (such as components, raw materials, medical devices, and combination products), though these were not a principal consideration. This technical report is not intended to apply to data integrity management in clinical practice and the implementation of clinical trials. Data integrity management in laboratory systems is discussed in PDA Technical Report No. 80: Data Integrity Management System for Pharmaceutical Laboratories (1), and data integrity management in quality management systems will be discussed in a future technical report.
…stamped electronic record that allows for reconstruction of the course of events relating to the …record, either paper or electronic, without obscuring or overwriting the original record. An audit trail facilitates the reconstruction of the history of such events relating to the record regardless of its medium, including the “who, what, when and why” of the action (3).

WHO
The audit trail is a form of metadata that contains information associated with actions that relate to the creation, modification or deletion of GXP records. An audit trail …either paper or electronic, without obscuring or overwriting the original record (4).

Attributable
It should be possible to identify the individual or computerized system that performed the recorded task. The need to document who performed the task/function is, in part, to demonstrate that the function was performed by trained and qualified personnel. This applies to changes made to records as well: corrections, deletions, changes, etc.

Legible
All records must be legible – the information must be readable in order for it to be of any use.

Contemporaneous
The evidence of actions, events or decisions should be recorded as they take place. This documentation should serve as an accurate attestation of what was done, or what was decided and why, i.e., what influenced the decision at that time.

Original
The original record can be described as the first capture of information, whether recorded on paper (static) or electronically (usually dynamic, depending on the complexity of the system). Information that is originally captured in a dynamic state should remain available in that state.

Accurate
Ensuring results and records are accurate is achieved through many elements of a robust pharmaceutical quality system. This can be composed of:
policies and procedures to control actions and behaviors, including data review procedures to verify adherence to procedural requirements
deviation management including root …

Together, these elements aim to ensure the accuracy of information, including scientific data that is used to make critical decisions about the quality of products.

Complete
All information that would be critical to recreating an event is important when …exception, including deviations that may occur during the process. This includes capturing all changes made to data.

Enduring
Records must be kept in a manner such that they exist for the entire period during which they might be needed. This means they need to remain intact and accessible as an indelible/durable record throughout the record retention period.

Available
Records must be available for review at any time during the required retention period, accessible in a readable format to all applicable personnel who are responsible for their review, whether for routine release decisions, investigations, trending, annual reports, audits or inspections (5).

Critical Process Parameter (CPP)
…should have established critical process parameters that can be monitored or controlled to ensure that the process produces the desired quality.

Critical Quality Attribute (CQA)
A physical, chemical, biological or microbiological property or characteristic that should be within an appropriate limit, range, or distribution to ensure the desired product quality.

Data Integrity
FDA
Refers to the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data …

MHRA
…secure manner, so that they are attributable, legible, contemporaneously recorded, original (or a true copy) and accurate. Assuring data integrity requires appropriate quality and risk management systems, including adherence to sound scientific principles and good documentation practices (3).

WHO
The degree to which data are complete, consistent, accurate, trustworthy and reliable and that these characteristics of the data are maintained throughout the data life cycle. The data should be collected and maintained in a secure manner, such that they are attributable, legible, contemporaneously recorded, original …

Data Flow Map
A flow map that uses a baseline process flow map and overlays the data flow.

Data Vulnerability
An indicator of data’s level of exposure to data integrity failures due to intrinsic weaknesses in manufacturing processes, data-capture technology, and human factors, or a combination thereof.

Detectability
The ability to discover or determine the existence, presence, or fact of a hazard (6). For purposes of this technical report, a hazard typically would be a data integrity breach.

…way, a vast amount of raw data, including metadata, in its native format until it is needed.

…Systematic steps taken or in place to reduce or limit the identified risk.

True Copy
FDA
…content and meaning of the original or raw data, which includes metadata required to reconstruct the CGMP activity and the static or dynamic nature of the original records (2).

WHO
A copy of an original recording of data that has been verified and certified to confirm it is an exact and complete copy that preserves the entire content and meaning of the original record including, in the case of electronic data, all essential metadata and the original record format as appropriate (4).
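As an illustrative aside (not part of the report's glossary), verification that an electronic copy is an exact and complete duplicate of the original is commonly performed by comparing cryptographic hashes of the two files. The sketch below assumes plain files on disk; metadata such as the original record format must still be checked separately, as the definitions above require.

```python
# Illustrative sketch: compare SHA-256 hashes to confirm an electronic copy
# is bit-for-bit identical to the original. File paths are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file in chunks so large records do not exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_true_copy(original: Path, copy: Path) -> bool:
    """Identical hashes imply identical content; essential metadata
    (e.g., timestamps, record format) must be verified separately."""
    return sha256_of(original) == sha256_of(copy)
```

A matching hash demonstrates that the content and meaning of the original data were preserved in the copy, which is the core of the certification step described in the WHO definition.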
2.1 Abbreviations

AHU/HVAC Air Handling Unit/Heating, Ventilation, and Air Conditioning
ALCOA Attributable, Legible, Contemporaneous, Original, and Accurate
API Active Pharmaceutical Ingredient
APIC Active Pharmaceutical Ingredients Committee
APR Annual Product Review
FDA U.S. Food and Drug Administration
GxP Good practice quality guidelines for various fields
ISPE International Society for Pharmaceutical Engineering
MES Manufacturing Execution System
MHRA Medicines and Healthcare products Regulatory Agency
NMPA National Medical Products Administration (China)
SID&GP Russian State Institute of Drugs and Good Practices
TGA Australian Therapeutic Goods Administration
human behavior.

The assessment of data integrity procedures, processes, systems, and controls using a consistent, structured approach and forensic techniques was not always a primary focus during FDA inspections. Periodically, the detection of egregious and unethical events causes FDA to redouble its efforts and refocus its attention on lapses in data integrity. For example, during the late 1980s and early 1990s, FDA uncovered data integrity violations at a large number of U.S.-based generic drug manufacturers (and some innovator companies) in what was later dubbed the “generic drug scandal.” The FDA discovered widespread falsification of production and control records and found that many abbreviated new drug applications contained falsified data and other serious data integrity violations. Several individuals from the industry and the FDA were criminally prosecuted, and the scandal prompted FDA to establish in 1990 a pre-approval inspection program (12) that remains in effect today. That program requires successful FDA pre-approval inspections as a condition of approval for drugs and biologics; subsequently, other major health authorities established similar pre-approval inspection programs. Then, in 1997, the FDA promulgated a regulation related to computer system electronic records and signatures for digital data in 21 CFR Part 11 (13).

These developments impacted industry practices, too. Data integrity became a focus of companies, industry conferences, and industry best practice documents such as the International Society for Pharmaceutical Engineering (ISPE) good automated manufacturing practice (GAMP) guides and PDA technical reports. Data integrity awareness and expertise increased among pharmaceutical professionals, and the incidence of data integrity problems reported by FDA during its GMP inspections of domestic and international pharmaceutical manufacturers declined. Egregious data integrity violations, however, still resulted in sanctions.

Starting in the late 1990s, the advent of the internet and growth in information technology fueled increased globalization of pharmaceutical supply chains. This rapid expansion roused regulators’ concerns about data integrity. In 2007, FDA reported that it had found objectionable data integrity issues at 3 of the 10 companies it had recently inspected on this topic (14). In the years immediately following, FDA again refocused its attention on data integrity practices. Since then, FDA has included data integrity citations in more than 200 warning letters sent to companies in more than 20 countries.

Since 2012, inspections by the FDA, European Medicines Agency (EMA), Medicines and Healthcare products Regulatory Agency (MHRA), World Health Organization (WHO), and other health authorities have revealed a notable increase in the incidence of data integrity issues at international manufacturers of APIs and finished pharmaceuticals worldwide. The prevalence and significance of the data integrity lapses discovered in recent years have led regulators to make data integrity a primary focus during inspections, and regulators are detecting data integrity lapses with increasing frequency. Pharmaceutical regulators continue to find conditions and practices that compromise data integrity, among them human errors, inadequate management oversight, insufficient employee training, system failures, inappropriate qualification or configuration of systems, poor procedures or not following procedures, and intentional acts of falsification.
FDA has cited in recent warning letters questionable data integrity practices in manufacturing operations. For example, a warning letter issued in July 2019 cited that batch records routinely included handwritten values within process parameters while the values recorded by the programmable logic controller (PLC) of the compression machine were frequently outside of the established process parameters (15). A second example, issued in December 2019, cited multiple discrepancies between the human machine interface (HMI) data and the entries made by operators into batch records (16).
Figure 3.1.1-1 FDA Warning Letters–DI Issues Related to Production Activities by Country of Inspected Location (WLs with Production Related Data Integrity Citations, 22 Feb 2012 – 15 Aug 2019)
Figure 3.1.1-2 Incidence of Warning Letters Citing Production Activities by Product Type (API vs Finished Pharmaceuticals): API (35), FDF (58), Both (3)
3.1.2 EU Non-Compliance Reports (NCR)
A review of approximately 163 EU non-compliance reports (NCRs) for the period between January 2012 and August 2018 found that 82 of the 163 NCRs (50%) include citations of data integrity-related observations (Figure 3.1.2-1). Review of the 82 NCRs that include data integrity citations found that approximately 46% involved some sort of falsification issue in the production system, 22% included falsification issues in the laboratory system, and 32% of the NCRs included falsification…
The data in Table 3.1.3-1 are based on observations by FDA and EU inspectors during their inspections of quality management systems for production operations between February 2012 and August 2019.
Table 3.1.3-1 Examples of Data Integrity Issues in Production Operations cited by FDA and EU, 2012–2019
Category — Examples of Data Integrity Issues Cited in Production Operations

False Entries — Falsification/fabrication of records (e.g., visual inspection data, cleaning validation records, batch records, annual product reviews, glove integrity testing)

False Statements — False or misleading statements (made to FDA investigators about the practice of blending failing and passing API batches)

Delay/Deny/Limit/Refuse Inspection — Delaying, denying, limiting or refusing inspection (failure to allow access to facilities, records, documents, or responsible individuals that interferes with an investigator’s ability to complete an inspection, such as hiding manufacturing-related records or dumping unlabeled vials when contents were questioned by FDA)

Computer Access Controls — Unprotected spreadsheets for performing calculations and statistical evaluations of production data; shared login credentials that did not permit identification of a specific person using the shared login to link specific actions to a specific operator; a password shared by multiple individuals to gain access to each individual piece of equipment and access level; inadequate controls to assure that changes in master production and control records or other records are instituted only by authorized personnel

Contemporaneous Data Entries — Pre-dated and backdated batch records; batch records signed by persons who had not performed the activities; uncontrolled spreadsheets used to record in-process quality data that were then transcribed and backdated in the production record; cleaning records that did not match video recordings; batch production data entered on “mock sheets” for later transcription to official records

Discrepancies — …dirty; batch records missing critical information and signatures; erroneous calculations and use of incorrect amounts of starting materials during production; inaccurate quantities for quality defects; AHU/HVAC filter-cleaning records with discrepant entries

Data Traceability — Programmable logic controllers (PLCs) and manufacturing equipment with shared login credentials that did not permit identification of a specific …

Destroyed/Unavailable — …cleaning records, sanitization records, annual product reviews, change controls, filter integrity testing, media fill data, sterilizer data, in-process weight checks, yield …
…Co-operation Scheme (PIC/S), China National Medical Products Administration (NMPA), Australian Therapeutic Goods Administration (TGA), and Russian State Institute of Drugs and Good Practices (SID&GP) have all released documents that provide the current thinking of regulators related to data integrity concepts and expectations, including the following:

MHRA: GxP Data Integrity Guidance and Definitions (March 2018) (3)
FDA: Data Integrity and Compliance with Drug CGMP: Questions and Answers, Guidance for Industry (December 2018) (2)
WHO: Guidance on Good Data and Record Management Practices, WHO Technical Report Series, No. 996, Annex 5 (2016) (4)
WHO: Good Chromatography Practices, Draft for Comments (July 2019) (18)
PIC/S: Draft PIC/S Guidance: Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments (November 2018) (5)
NMPA: Draft Drug Data Management Standard (Oct 2016) (19)
TGA: Data Management and Data Integrity (DMDI) webpage (April 2017) (20)
SID&GP: Guidelines: Data Integrity and Validation of Computerized Systems (August 2018) (21)

Several associations, including PDA, ISPE, and the Active Pharmaceutical Ingredients Committee (APIC), also have issued best practice documents related to data integrity:

PDA: Elements of a Code of Conduct for Data Integrity (February 2016) (22)
PDA: Technical Report No. 80: Data Integrity Management System for Pharmaceutical Laboratories (August 2018) (1)
APIC: Practical Risk-Based Guide for Managing Data Integrity (March 2019) (25)
Risk should be controlled regardless of whether the data are managed using a manual system or an automated electronic system. Any potential breach related to data integrity—particularly the loss, omission, deletion, or alteration of data—must be investigated, addressed, and accounted for in the quality system and, as necessary, reflected in the risk management documentation. Given that data may be managed and/or transferred between different formats or systems, efforts should be made to mitigate the risk to the integrity of the data during these transfers. Therefore, judicious application of these concepts and targeted application of data integrity principles is warranted as the industry transitions from paper to hybrid (manual and electronic) and, eventually, to fully electronic systems. Data integrity risk also should be addressed during the transfer of data between electronic systems.

The PDA Technical Report No. 54 series describes QRM more broadly, including how to conduct an appropriate risk assessment and formulate a mitigation strategy to reduce the risks identified. For more information on QRM generally and other important elements, such as risk communication, please see the TR 54 series and ICH Quality Guideline Q9: Quality Risk Management (8,26).
4.1 Considerations in Assessing Risk
Risk assessments should be conducted proactively at the time any new data management system is implemented and should be captured by the overarching change control. At this stage in the process, risk assessments can serve as preventive measures to ensure that data integrity is maintained throughout the system lifecycle. When performed on existing data management systems, the assessments should identify potential data vulnerabilities based on the existing level of controls and the criticality of the data.

Organizations also should consider performing a risk assessment after a deviation or incident to identify potential gaps in the data management system. Assessment of data integrity risks also should be done in conjunction with the change management process of a data management system to avoid creating new or additional risks. This will enable informed decision-making regarding the extent of the issue and the potential impact. Key objectives of this assessment are to determine if the root cause is related to a gap in data management and, if so, to identify the extent of the gap and the…

Data integrity controls should be considered from both the behavioral and technical perspectives. From a behavioral perspective, the overall quality culture, including communications and training, is intimately related to the effectiveness of the data integrity program.
Unintentional errors:

Thinking Errors
- Error caused by gaps in rules stating what tasks should be performed and by whom, e.g., lack of or inadequate standard operating procedures (SOPs)
- Knowledge Gap: error caused by knowledge gaps in how to perform a task, e.g., lack of or inadequate training

Action Errors
- Attention Failure: error caused by taking the wrong action, e.g., an unfocused state of mind, a frequently performed action that goes wrong, multitasking, or aggressive deadlines
- Memory Failure: …

Intentional violations:

- Violations caused by malicious intent to perform a fraudulent act, e.g., falsifying data for personal gain or to avoid personal pain
- Misconduct: violations caused by knowingly ignoring procedures or controls due to misplaced priority, e.g., ignoring established controls to compensate for aggressive targets or time pressure
One category is the unintentional act. Risk may be unintentionally introduced in a variety of ways. The unintentional act may take the form of an “action error” caused by a slip in attention or lapse of memory, or a “thinking error” caused by gaps in governing procedures (knowing what or who) or knowledge (knowing how). For instance, numbers might be reversed in an entry or entered in the wrong spot due to a slip in attention. A bar code might have been applied incorrectly to materials that are brought into the processing room, and the operation started before operators realize that the materials were for a different batch. Another example could be that one operator sets up the equipment believing it will be used for one product, but a different product is brought in and processed.

…should identify the root cause, determine the impact and criticality of the deviation, and assign appropriate CAPAs. Where human error is suspected or identified as the cause, this should be justified, having taken care to ensure that process, procedural, or system-based errors or problems have not been overlooked, if present.
e other category is the intentional act. Risk may be introduced intentionally through misconduct,
by failing to follow established procedures and controls due to misplaced priorities or through mali-
cious acts. For example, to personally benefit or to avoid personal consequences associated with hav-
ing made the mistake, an employee may cover up a mistake by creating false entries showing that the
desired activity or result occurred. In one misconduct scenario, the operator was not paying attention
to the mixing or blending operation and allowed it to run for longer than the validated time, but
entered a value within the required range. Another example could be an overworked operator who is
struggling to keep up with the manufacturing schedule and, in an effort to catch up, reprints the last
acceptable in-process test result with the batch number of the new sample. Determining the reason
for an intentional or malicious act may not always be possible.
In the case of unintentional risks to data integrity, it will suffice to follow the process described for
risk assessment in this section. Provided the “right” people are in the room—whether operators,
am
technicians, mechanics, programmers, and/or supervisors—the risk assessment is likely to identify
the areas in which an accident in data capture and recording is possible. e purpose of the risk as-
sessment is to determine the probability of that accidental error occurring.
N
When the risk to data integrity is the result of an intentional action, a comprehensive investigation
et
is needed. Intentional manipulation of data is hard to detect and may require the use of forensic
detection techniques. e risk assessment process may uncover some of the circumstances in which
Vi
deliberate acts have occurred, but solid assurances that no repercussions will occur for contributing
such information during a risk assessment requires a great deal of trust.
4.1.2 Risk Control Strategies

The potential impact on data must be determined and documented and, following the risk assessment, mitigations must be identified to reduce risk. For risks with a high probability of occurrence and a high impact on product quality or patient safety, a balanced risk control strategy will need to be implemented. Understanding the manufacturing process and the data lifecycle is crucial to developing successful risk control strategies.
Although automation is not the solution to every problem, it can be a useful tool as part of a risk control strategy. Intelligent barcodes can be used to confirm product and component identity and can prevent a process from proceeding until an error is rechecked and corrected. The time-honored practice of a second-person check or a supervisory review is also important, provided that it is timely; frequently, the supervisory review takes place too long after an event to confirm that it occurred as recorded. Moving further into the digital age, electronic automation, when implemented and qualified appropriately, will replace such reviews.
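The barcode gate described above can be sketched as a check that refuses to let a process step proceed on a mismatch. This is an illustrative sketch only; the function names and component codes (`barcode_gate`, `dispense`, "API-001") are hypothetical, not from this report.

```python
# Illustrative sketch (not from TR-84): a barcode gate that blocks a process
# step until the scanned component matches the expected item.

def barcode_gate(expected_component: str, scan: str) -> bool:
    """Return True only when the scanned barcode matches the expected component."""
    return scan.strip() == expected_component

def dispense(expected_component: str, scanned: str) -> str:
    # The process does not proceed until the mismatch is rechecked and corrected.
    if not barcode_gate(expected_component, scanned):
        raise ValueError(
            f"Component mismatch: expected {expected_component!r}, scanned {scanned!r}"
        )
    return f"Dispensing {expected_component}"
```

In practice the gate would sit in the control system itself, so the block is a prevention control rather than a detection control.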
All data, whether recorded electronically or on paper, requires a risk management approach to determine what controls are important to assure data integrity.
In order to estimate the level of exposure to potential data integrity issues, the following elements must be considered, as illustrated in Figure 4.2-1:
• Data criticality
• Current level of data controls
Thus, data criticality and the current level of data controls, taken together, can aid in identifying the risk of data vulnerability. Data vulnerability indicates the level of exposure to data integrity issues due to intrinsic weaknesses in the control of three key elements: data capture technology (data management), human factors, and manufacturing processes, or a combination of all three.
All data related to manufacturing is critical, as it is necessary to allow for the reconstruction of actual events; however, the criticality level may differ depending on the data's use and the controls under which it was generated. Data related to product quality (or supporting a disposition decision) and patient safety would generally be classified as High criticality; critical quality attributes (CQAs) and critical process parameters (CPPs) fall into this class. Data related to process robustness and consistency would be considered Medium criticality. Process robustness is discussed in detail in PDA Technical Report No. 54 and the other technical reports in the TR 54 series (26). Data related to neither would be considered general GMP data of Low criticality.
Data criticality is a measure of the significance of a data point or data set in supporting the safety, quality, identity, purity, potency, and/or effectiveness of the product being manufactured. The manufacturer can best determine the criticality of the data generated by its own processes, since it knows the quality target product profile, CQAs, critical processes, and CPPs for the product being manufactured (27). This concept also applies to clinical trial manufacturing: while CPPs and CQAs may not be fully established in that context, initial information should be available to determine the critical aspects of the manufacturing processes and existing controls.
Figure 4.2-1 (excerpt): Determine Data Criticality (Section 4.2.1)
1 While this technical report uses high/medium/low risk ranking levels, organizations may use different risk ranking levels and adjust their approach accordingly. Users would need to be mindful not to compromise the intent of this framework.
Table 4.2.1-1

High – When the intended use of the data directly impacts product quality and/or product safety:
• Product quality: monitoring and control of processes that may be responsible for causing variability during manufacturing, release, or distribution impacting critical quality attributes, critical material attributes, critical process parameters, or critical process controls, including those that may be linked with the product registration dossier (where applicable)
• Product safety: monitoring and control of processes that ensure effective management of field alerts, recalls, complaints, or adverse events

Medium – When the intended use of the data relates to quality attributes, material attributes, process parameters, key process parameters, or process controls that are not CQAs/Critical Processes (CPs)/CPPs and may or may not be in the product registration dossier; this includes parameters of the manufacturing process that "may not be directly linked to critical product quality attributes but need to be tightly controlled to assure process consistency" (28)

Low – When the intended use of the data is to provide evidence of GMP compliance relating to monitoring and control of processes that do not fall into the High or Medium category
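The High/Medium/Low ranking above can be sketched as a small decision function. The boolean inputs are assumptions chosen to mirror the intended-use criteria; they are not terminology from the report.

```python
# Hedged sketch of the High/Medium/Low data criticality ranking.
def classify_criticality(impacts_quality_or_safety: bool,
                         supports_process_consistency: bool) -> str:
    """Rank a data point per the intended-use criteria (illustrative sketch)."""
    if impacts_quality_or_safety:        # CQAs, CPPs, recalls, complaints, ...
        return "High"
    if supports_process_consistency:     # non-critical parameters needing tight control
        return "Medium"
    return "Low"                         # general GMP compliance evidence
```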
4.2.2 Data Control Levels
An assessment must be performed to determine the existing level of data controls (Table 4.2.2-1). High-level data controls are those associated with validated automated processes and minimal human intervention throughout the data lifecycle, whereas low-level data controls involve manual processes with a high degree of human intervention throughout the data lifecycle. Section 5.0 provides examples of high-, medium-, and low-level data controls that may be appropriate for the criticality of the activity.
Table 4.2.2-1

High – High degree of validated process automation; electronic data lifecycle (e.g., capture, analysis)

Low – Manual data lifecycle (e.g., capture, transcription, second-person witnessing); manual process measurements and testing; manual processes with a high degree of human intervention
weaknesses in three key elements: data management technology, human factors, and GMP manufacturing processes, or a combination thereof.

To determine the vulnerability of the data being collected, the user first must assess the level of data criticality and the existing level of data control. Each will be assessed as High, Medium, or Low, using the definitions in Tables 4.2.1-1 and 4.2.2-1.
Subsequently, a 9-Box grid can be created by mapping the data criticality level against the data control level. The 9-Box grid helps visualize how data criticality and data controls interact to rank levels of risk. The criticality of the data determines the row of the vulnerability grid, while the level of control associated with the data dictates the appropriate column.2

2 While this grid is based on the high/medium/low risk ranking levels, organizations may adapt it to reflect local needs and use different risk ranking criteria for data criticality and level of data controls. The resulting grading of data vulnerability would likewise need to be adapted.
The objective is to have the right level of controls in place based on the criticality of the data, and to avoid being in the red or orange boxes. The goal of mitigation strategies, then, is to move from right to left by increasing the level of data control. In this assessment, all data relevant to GMP is critical, even if it relates to routine operations.
The 9-Box grid in Table 4.3-1 provides a visual depiction of the vulnerability of data given the current level of data management controls and the criticality of the data. The 9-Box grid does not provide any opportunity or cover for falsification of data.

Table 4.3-1 depicts only the data controls relating to data management technology. For a holistic assessment of vulnerability, the data controls relating to human factors and the GMP process should be similarly assessed. Appendix I provides further instructions for using the 9-Box grid, with illustrative examples.
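One way to sketch the 9-Box lookup in code: cross the data criticality (row) with the existing level of data control (column). The green diagonal and the blue over-controlled cells follow the text directly; the split between orange (one-level gap) and red (two-level gap) for under-controlled cells is an assumption consistent with the goal of moving from right to left.

```python
# Minimal sketch of the 9-Box vulnerability lookup. The orange/red split for
# under-controlled cells is an assumption, not taken verbatim from Table 4.3-1.
RANKS = {"High": 2, "Medium": 1, "Low": 0}

def vulnerability(criticality: str, control: str) -> str:
    gap = RANKS[criticality] - RANKS[control]   # >0 means controls lag criticality
    if gap <= -1:
        return "blue"     # more control than the data requires (M/H, L/H, L/M)
    if gap == 0:
        return "green"    # controls match criticality (H/H, M/M, L/L)
    return "orange" if gap == 1 else "red"      # under-controlled; mitigate right-to-left
```

Increasing the level of data control lowers `gap`, which is the "move from right to left" mitigation described above.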
Table 4.3-1 Data Vulnerability Grid (9-Box) – Example for Data Management Technology Controls
tion to analysis tools, and archiving. The 9-Box can also be used before implementing new or revised processes. The outcome of the assessments will be only as good as the completeness and objectivity of the internal assessments. Underestimating risk factors and overestimating current data integrity controls will diminish the effectiveness of the outcome.
The 9-Box also allows users to identify controls that could be implemented to decrease the risk to data in the manufacturing process. Implementation of those controls could decrease data vulnerability from a high level to a lower level (from right to left in the grid). Examples of implementing the 9-Box in specific scenarios are included in Appendix I.
This technical report recommends, as the first step, selecting a product and identifying the data that is critical for that product to meet its approved specifications and its intended use (see Section 4.2.1). The critical data for the product should already have been identified as part of the product development process and included in corresponding documentation. After identifying the data that impacts the product CQAs and the processes that are critical in its manufacture, the user can prioritize internal assessments, identify risk factors, and categorize the data vulnerability for these critical processes.
Because three main elements impact data vulnerability—data management technology, human factors, and GMP process—the next step is to consider these three categories when assessing the specific data management process for the operation being evaluated. After completing this analysis for a specific process or unit operation, the completed information may be leveraged for the same process or unit operation used for another product, giving due consideration to potential differences.
For data management, the full lifecycle of the data needs to be considered to determine areas of vulnerability. This includes how the data is captured (manual/automated/hybrid system), frequency of collection, validation of any automated systems used to capture the data, migration of production data to data analysis tools, data transcription, and data archiving. Here again, data flows that rely on manual capture and review are considered inherently more vulnerable than those that are automated. To determine the reliability of the current processes, the assessment should consider trends related to data transcription errors and poor application of good documentation practices. Conversely, automated data collection, properly validated, will be less prone to error and more likely to yield high-quality data.
For the GMP process, the complexity and robustness of the manufacturing process being evaluated should be considered to determine areas of data vulnerability. More complex processes are likely to produce a larger volume of data, which could increase the number of factors that impact data vulnerability. The assessment needs to look at all aspects of the manufacturing process and evaluate each process on a case-by-case basis. A newly validated manufacturing process that lacks a history of robustness will have greater data vulnerability than an established process with high process capability. Trends related to change control, out-of-specification (OOS) results, alarms, and out-of-calibration instances can also illustrate the vulnerability of data being collected for the manufacturing process. Once identified, such trends should be investigated to determine their relationship to data vulnerability.
Data Management (data lifecycle):
• Hybrid systems (e.g., discrepancies between paper printout and corresponding electronic record, data duplication)
• New or complex manufacturing technology (causing repeat errors)
• Overwriting existing electronic data
• System validation age
• System controls related to data export, calculation, reporting, transfer, archiving, and retrieval
• Appropriate access level
• Technologies with inadequate data integrity elements such as unique access, audit trail, data backup & restore, electronic data review, date & time stamp, among others (e.g., spreadsheets, LIMS)

Human Factors:
• Unexplained discrepancies between electronic raw data and reported data
• Repeat human errors (e.g., due to multitasking, high personnel turnover, inadequate training, time pressures)

GMP Process:
• Aborted runs (e.g., due to lack of planning, understanding system operation, suitability of equipment and process)
• Complex process (e.g., many interfaces, high level of human intervention, high levels of manual data entry)
Once the risk factors have been identified, the third step is to assess the existing controls as described in Section 4.2.2. The controls in place should have been identified, for instance, by using a data flow map as discussed in Section 4.5. A historical review of information related to data integrity issues contained in the quality management system (QMS) can be valuable in determining the adequacy of the existing data integrity controls.
If the process or unit operation fits into one of the green boxes (H/H, M/M, L/L), the controls in place are sufficient for the criticality of the data and the identified risk factors. If the process fits into one of the blue boxes (M/H, L/H, L/M), the controls in place may be more than required for the criticality of the data and the minimal risk factors identified. This does not imply that the controls for these processes should be decreased, but that resources potentially could be shifted to other processes, such as those in the orange and red boxes.
4.5 Data Process Flow Maps
Data process flow maps, a GMP requirement in certain parts of the world (29), are an extremely useful tool for visualizing the movement of data through a given operation and can facilitate the identification of areas of high risk to the data. Every step from start to end of a given process (e.g., from order creation to release of finished product) should be identified and documented in order to obtain a complete picture of the data flow in the process. For each step, the data process flow map should identify the activity performed, how the activity is performed, the systems used to perform the activity, the data created, the method of recording the activity, and the decisions made from the data. It also should include details on how data flows between systems or process steps and the means used to transfer data, taking the data lifecycle into consideration as part of the supporting documentation. The data process flow map should start with the point of origin of the data and include the systems (e.g., equipment, devices, applications) used in the process. Documenting the data flows related to manufacturing processes and systems through personal observation, such as through a Gemba walk, can prove helpful in this process.
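The per-step attributes listed above (activity, how performed, system, data created, recording method, decisions) can be captured in a small record type. This is a minimal sketch; the field names and example values are illustrative assumptions, not prescribed by the report.

```python
# Sketch of one step in a data process flow map. Field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class FlowStep:
    activity: str
    how_performed: str          # manual / automated / hybrid
    system: str                 # equipment, device, or application used
    data_created: str
    recording_method: str       # e.g., wet ink, PLC printout, database record
    decisions: list = field(default_factory=list)
    real_time: bool = True      # solid line (real-time) vs dashed (after the fact)

# Hypothetical example step:
granulation = FlowStep(
    activity="Granulation",
    how_performed="manual",
    system="Granulator",
    data_created="Start/end times, total granulation time",
    recording_method="Wet ink in paper batch production record",
    decisions=["Total time within validated range?"],
)
```

A list of such steps, connected in order, is one way to make the map machine-readable so high-risk points can be queried rather than spotted by eye.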
Points in the process where data integrity could be compromised should be identified. Asking a series of critical-thinking questions at each stage of the data flow throughout the lifecycle can facilitate the identification of such points, for example:
• Is an audit trail available that demonstrates if and when data is manipulated in any way, and does it identify who has performed the operation?
• Is data held in temporary memory? What happens if there is no power to this unit?
For instance, unauthorized access to a system could potentially allow for manipulation of batch-related data. These points where data integrity could be compromised represent areas of risk. For each, a risk assessment should be conducted, and mitigations to reduce risk identified and implemented. Differences in detectability (including elements related to electronic vs. manual processes) should be considered as part of the risk assessment.
1. Identify the key stakeholders involved in the process and hold a brief interview to obtain an overview of the process.
2. Create a process data flow map to determine:
• If the data is captured in paper or electronic format,
• If the data flow takes place in real time (solid lines) or later (dashed lines),
• If automated printed outputs are available with an out-of-validated-range alert, and
• If procedural controls are already in place or in use (red ovals in Figure 4.5.1-2).
3. Once the process/data flow is graphed, areas of data integrity vulnerability can be highlighted and used as input to build a remediation plan.
Case 1 (Figure 4.5.1-1) depicts a paper-based, manually controlled granulation process with an endpoint of total granulation time that has been derived from process validation work, included in the master batch record, and approved by the quality unit. Case 1 relies on the data being captured by the operator, with second-operator verification, by wet-ink recordings made directly into the paper batch production record in real time. In this example, the endpoint of the granulation operation is a CPP that impacts the CQA of the finished product and would therefore be considered High-criticality data. The existing level of controls would be considered Low because of the totally manual operation (see Section 4.2.2). The only controls in this case are the second-person verification and a review by the quality unit.
In Case 2 (Figure 4.5.1-2), the same validated granulation process has been automated. The total granulation time has been included in a recipe, approved by the quality unit, and programmed into a PLC that controls the process. This programming includes an automated alert when the validated range is exceeded. The control level is now considered High because the validated, PLC-automated data collection system does not depend on any human intervention. With this level of control and the PLC's printout of the granulation process, which includes the data on the total granulation time and an out-of-range alert, second-person verification is not necessary. The automated data collection further protects the integrity of this High-criticality data by preventing changes to the total granulation-time data in the system. The full batch production record is still reviewed by the quality unit as part of batch release.

Automated data capture of the granulation time, printouts of the data, and automated alerts when the total granulation time exceeds the validated range can reduce the procedural burden as well as improve data integrity assurance. This becomes obvious when comparing the data flow of Case 1 to that of Case 2. In Case 2, the PLC controls the process, captures and stores the data, and generates a printout with automated alerts that is attached to the paper batch production record, so there is no longer a need for second-person verification.
Figure 4.5.1-1 Case 1: Granulation Operation (Paper-Based Manual Process without Automated Alerts)

Figure 4.5.1-2 Case 2: Granulation Operation (PLC-Controlled Process with Automated Alerts)

(Legend for both figures: BPR – Batch Production Record; MBR – Master Batch Record; solid lines – real-time; dashed lines – after the fact)
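The Case 2 alert logic amounts to a range check on the recorded total granulation time. A minimal sketch, assuming a hypothetical validated range of 10–15 minutes; a real range would come from process validation, and the check would run in the qualified PLC, not in application code.

```python
# Hedged sketch of the Case 2 automated alert: flag a total granulation time
# outside the validated range. The range values are hypothetical.
def granulation_alert(total_minutes: float,
                      validated_range=(10.0, 15.0)) -> str:
    """Return 'PASS' inside the validated range, otherwise an out-of-range alert."""
    low, high = validated_range
    if low <= total_minutes <= high:
        return "PASS"
    return f"ALERT: total granulation time {total_minutes} min outside {low}-{high} min"
```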
Many events during manufacturing may compromise the integrity of the data. Any number of factors may cause these events, including human error, insufficient training, system failure, inappropriate qualification or configuration of systems, poor procedures, or people not following procedures.
PDA recognizes that quality culture is critically important in the prevention and detection of data integrity breaches; however, this technical report is designed to identify risk and potential mitigation of a data integrity breach regardless of where on the spectrum of data integrity (system error, individual error, individual or institutional malfeasance) that breach occurs. Of course, intentional fraud and intentional manipulation of data are unacceptable under any circumstances, regardless of data criticality, vulnerability, or existing level of controls.

In supporting and communicating expectations around a culture of quality for data integrity, PDA recommends that entities use the PDA publication Elements of a Code of Conduct for Data Integrity in the Pharmaceutical Industry (22, 21), which outlines the key elements necessary to help ensure the reliability and integrity of data throughout all aspects of a product's lifecycle. It can be used, in whole or in part, as a control approach to guide a company's internal practices, to create or modify an existing code of conduct.
FDA, MHRA, and PIC/S have all released documents to reeducate the industry on current data integrity concepts and expectations. These are the "what"—what is required to ensure the integrity of data.

Data integrity controls can be embedded in working procedures and practices or included as part of computer system validation and equipment qualification or usage. These controls may be used to minimize the potential that a data integrity issue will occur and to increase the probability of detecting an issue that does occur. The data integrity controls are the "how"—how to ensure that these requirements are implemented in practice.
Regulations convey the core elements that must be in place to ensure data integrity and a robust QMS, including the following examples:
• Standard operating procedures (SOPs) are in place for completion and retention of records.
• People are suitably trained for their intended job purpose.
• Systems are validated or qualified for their intended purpose.
• altered, permanently linked to the document, and have a time and date stamp.
• Data integrity risks are identified as part of the QRM process and dealt with accordingly.
• A management review process allows data integrity issues to be elevated to the highest levels of the organization.
• The data governance policy is endorsed at the highest levels of the organization, empowering a strong data integrity culture at all levels of the organization.
While there are many requirements to ensure a robust QMS, Section 5.0 focuses only on data integrity controls where it is acceptable to differentiate the level of controls applied. This means that the levels of controls applied (e.g., frequency of review, number of items to check, who performs the review) may differ depending on the criticality of the data, system, or activity. Consistent with ICH Q9, which promotes a risk-based approach, highly critical data, systems, or activities require more controls, while less critical data or activities require fewer controls.
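As a sketch of what a criticality-differentiated control might look like in practice, consider a review-frequency table keyed by criticality. The specific frequencies (daily/weekly/monthly) are assumptions for illustration, not recommendations from this report; each organization would set its own using a risk-based approach.

```python
# Illustrative mapping of one differentiated control (frequency of review)
# by criticality. The frequencies shown are hypothetical.
REVIEW_FREQUENCY = {
    "High": "daily",
    "Medium": "weekly",
    "Low": "monthly",
}

def review_frequency(criticality: str) -> str:
    """Look up the review frequency for a given criticality level."""
    return REVIEW_FREQUENCY[criticality]
```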
This portion of the technical report focuses on only a small part of all the data integrity controls required by regulations. Most data integrity controls required by health authority regulations apply in the same manner for any GMP data; that is, the controls do not differ based on the criticality of the processes performed by the organization. Such controls are not discussed further in this technical report.
es, the individuals involved should make the necessary correction, explain why the correction was
made, and sign with the time the correction was made, leaving the original entry still legible.
GMP regulations also require that, in electronic systems, a user's privileges be limited to allow the employee to perform the work, without any additional unnecessary privileges. Access to change the date or time of the system, for example, should be limited to a very small group of people, ideally independent administrators. Regardless of whether the system is highly critical (e.g., granulation) or less critical (e.g., secondary packaging systems that do not store data relating to quality attributes or parameters such as expiry dates), the same controls should be in place to restrict access.
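The least-privilege rule above can be sketched as a role-to-action table in which only an independent administrator role holds the clock-setting privilege. The role and action names here are hypothetical, not drawn from any particular system.

```python
# Sketch of least-privilege enforcement: only administrators may change the
# system date/time, regardless of system criticality. Names are hypothetical.
PRIVILEGES = {
    "operator": {"record_data"},
    "supervisor": {"record_data", "review_record"},
    "admin": {"record_data", "review_record", "set_clock", "manage_users"},
}

def authorize(role: str, action: str) -> None:
    """Raise PermissionError unless the role holds the requested privilege."""
    if action not in PRIVILEGES.get(role, set()):
        raise PermissionError(f"Role {role!r} may not perform {action!r}")

authorize("admin", "set_clock")        # permitted
# authorize("operator", "set_clock")   # would raise PermissionError
```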
Any given system may have some controls for which it is possible to differentiate the level of control applied, and some controls that will be the same regardless of the criticality of the activity.
5.2 Methodology Used to Determine Differentiated Controls
To determine the controls required for data, records, or systems, existing regulations and guidelines for data integrity were reviewed (2-5, 22, 27). The recommended controls were categorized against the type of document control system being employed: paper-based, electronic-based, or hybrid. Some of the controls identified apply to paper-only systems and some apply to electronic-only systems. Hybrid systems incorporate many of the controls for both paper- and electronic-based systems.
The authors identified seven categories of controls for which it is acceptable to differentiate the level of control applied, as shown in Table 5.2-1. Each category was assessed to determine whether the data, record, or system was being controlled and then was further scrutinized to determine which controls within that category may be differentiated. For example, based on criticality, the technical report team considered whether it is acceptable to differentiate the frequency of control (daily vs. monthly), who performs the control (manufacturing vs. quality unit), or what needs to be controlled (more-detailed vs. less-detailed review).

Where differentiation is acceptable, details of how the controls differ based on criticality are discussed in Section 5.3. The level of controls required may be applied at the data, record, or system level, depending on which of the seven categories is being discussed. Any requirements within a category that remain consistent across the criticality levels are considered basic elements in ensuring a robust QMS and are out of scope for this technical report.
Table 5.2-1 Categories Where Different Levels of Data Integrity Controls are Acceptable based on Criticality

Category of Control | Type of System | Short Description of Category
1. Storage of and access to completed/archived paper records | Paper | Controls associated with storage of and access to completed/archived paper records, and who has access
4. … | … | …systems access to electronic records and systems is applied. Discusses taking a risk-based approach to identify which…
Sections 5.3.1–5.3.7 have been developed with manufacturing operations in mind; however, many
of the concepts discussed can also be applied to other functional areas (e.g., analytical laboratories).
Some of the differentiated data integrity controls should be applied at the system level, others relate to records, and some are relevant to data points. Each table provides the necessary details concerning detection vs. prevention and the level at which the control should be applied (data point, record, or system).

Those sections provide examples of critical activities designated as High, Medium, and Low for illustrative purposes; they are not prescriptive, and the examples are intended to stimulate thinking. Each entity should assess its own data and apply the concepts in this technical report to determine the controls necessary for its operations. In addition, illustrative suggestions are provided throughout regarding the frequencies at which certain activities should take place. Organizations should determine the frequencies that are most applicable to their operations using a risk-based approach to ensure that the periodicity of activities is commensurate with the risks identified.
Sections 4.3 and 4.4 discuss that, in the data vulnerability model, the risks associated with data management, human factors, and GMP process must be identified, and the controls in place must be assessed to confirm they are sufficient for the risks identified. Sections 5.3.1–5.3.7 primarily discuss data management controls.
The controls discussed below would result in data being placed in one of the green boxes (H/H, M/M, L/L) of the 9-Box grid (Table 4.3-1). That is, the controls would be adequate for the criticality of the data and the identified risk factors. If fewer controls are in place than those recommended in Sections 5.3.1–5.3.7, the level of data management controls is not adequate for that level of data criticality. On the other hand, if more controls than required are in place, the data vulnerability would fall in one of the blue boxes.

The controls discussed in Sections 5.3.1–5.3.7 are only one part of the controls that need to be considered to assess the overall data vulnerability. Organizations also must consider the data management controls, human factor controls, and GMP process controls that apply without any possible differentiation in level of control. Whether the levels of control required are differentiated or not, all controls must be described in SOPs.
or archived paper records. These controls are applied at the record or system level.

All documents should be stored in a medium that will not degrade the readability of the records upon storage. Documents of High and Medium criticality, including batch records and validation protocols, should be stored in a climate-controlled environment with card-key or otherwise restricted access and a logbook recording the removal and return of records. An annual review of who has access to the room is recommended.

For documents of Low criticality, such as completed user-access authorization forms where evidence of training exists in another system, storage in an office retention location, such as a local file cabinet with limited access to the file cabinet keys, would suffice. Access to the keys should be reviewed every two years.
Control | High | Medium | Low
Access Control | Card-key access or limited key access with entry documented in logbook | Card-key access or limited key access with entry documented in logbook | Limited key access
Periodic User Access Review | Annually | Annually | Every 2 years
5.3.2 Generation and Reconciliation of Documents
Table 5.3.2-1 identifies several prevention controls related to the issuance and reconciliation of paper records. The controls described are applied at the record or system level.
Highly critical records, such as batch records, should be issued by a designated unit and have a unique identifier. The printing of these records should be controlled, i.e., a specified number of people authorized by the quality unit to print, with full traceability of the number of copies printed, when they were printed, and by whom. Full reconciliation of the record, including extra sheets or additional pages, is required after use. Destruction procedures for unexecuted (blank) forms should be in place and may be performed by the issuing unit.

For Medium-criticality documents, like the form used for calibration, a unique identifier for the record is not required. Controlled printing should ensure that the templates are printed by a limited number of people authorized by the quality unit. While an unlimited number of copies of the template is allowed, a process should be in place to avoid misuse, including use of expired or outdated forms. Full reconciliation of records and pages based on the quantity issued is warranted. Destruction procedures for unexecuted (blank) forms should be in place and may be performed by the operating unit.

Low-criticality documents, for example, the blank forms used for recording training activities, may be printed by the end user without a unique identifier, and no reconciliation activities are required after use. The number of copies that can be printed is not limited, and any unexecuted (blank) forms may be discarded by the user without the quality unit being directly present in the process.

Even though the quality unit does not need to be present for the destruction of unexecuted (blank) forms, it must maintain oversight to ensure that procedures are followed (e.g., through approval of the procedure). Acceptable quality unit oversight may constitute all or part of the following:
• Quality unit is responsible for approving the procedure,
• Processes are reviewed in practice as part of the self-inspection program, and/or
• Processes are reviewed in practice as part of the routine quality unit walkthrough of manufacturing.
Table 5.3.2-1 Data Integrity Control Grid for Generation and Reconciliation of Documents

| Prevention Control | High Data Criticality | Medium Data Criticality | Low Data Criticality |
| --- | --- | --- | --- |
| Controlled issuance – Who | Individuals authorized by quality unit | Individuals authorized by quality unit | Anyone |
| Reconciliation | Full reconciliation of records and pages based on unique identifier | Full reconciliation of records and pages based on quantity issued | No reconciliation |
| Controlled print | Required | Required | Not required |
| Bulk printing allowed? | No | Yes, with process in place to avoid misuse | Yes |
| Destruction of blank forms | Performed by the issuing unit, quality unit oversight required | Performed by the operating or issuing unit, quality unit oversight required | Performed by the individual, quality unit oversight required |
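The full-reconciliation control for issued records can be sketched in code. This is an illustrative fragment only, not from the report; the function name, the identifier format, and the returned dictionary layout are all assumptions:

```python
def reconcile_issued_copies(issued_ids, returned_ids):
    """Compare the unique identifiers of issued copies against those
    returned after use; any difference must be investigated and documented."""
    issued, returned = set(issued_ids), set(returned_ids)
    return {
        "missing": sorted(issued - returned),      # issued but not returned
        "unexpected": sorted(returned - issued),   # returned but never issued
    }
```

For Medium-critical documents, which carry no unique identifier, the same comparison would be made against the quantity of pages issued rather than against individual identifiers.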
5.3.3 Manual Activities when Recording or Transferring Data between Paper and Electronic Formats
Hybrid data or records are used in many manufacturing operations. An example of this is a system that captures and stores electronic raw data and generates a printed report that is used during the documentation review and approval process. When this happens, the data in the electronic system and the printed report (e.g., times, temperatures, critical parameters, pass/fail results) must be the same.

In some situations where hybrid systems are employed, it is acceptable to differentiate the levels of controls required based on the criticality of the activity. For manual activities, when recording or transferring data between paper and electronic formats, three scenarios were identified:
5.3.3.1 Data Accuracy when Manually Recording Data without a Controlled Second Format
5.3.3.2 Transcription of Manually Recorded Data into an Electronic System
5.3.3.3 True Copy (Paper to Electronic)

The controls are applied at the data level for the first and second scenarios, and at the record level for the true copy scenario. All controls are designed to prevent a potential data integrity issue from occurring.
5.3.3.1 Data Accuracy when Manually Recording Data without a Controlled Second Format
Table 5.3.3.1-1 applies when a manual recording (recording on paper or direct manual recording into an electronic system, like a manufacturing execution system (MES)) is made without a controlled second format available. This means that a reading, measurement, or result is observed from a display or another source, and no audit trail, printout, photo, or other media is available to corroborate the data recorded. These recommendations are intended for use while an organization is moving toward technological improvements that allow for a controlled second format such as an electronic audit trail, printout, photo, or other media.
For Medium-critical data points, for example a reading of temperature and humidity data for a compression operation, downstream verification by a peer is recommended to ensure that the requirements are met.

Low-critical data points, for example the start time of a secondary packaging operation for products not time- or temperature-sensitive, only require a general check by a peer to ensure adherence to good documentation practices. A second-person check for accuracy is not required.
Table 5.3.3.1-1 Data Integrity Control Grid for Data Accuracy when Manually Recording without a Controlled Second Format
| Prevention Control | High Data Criticality | Medium Data Criticality | Low Data Criticality |
| --- | --- | --- | --- |
| Second check for data recording – What? | Four-eyes and downstream quality review | Downstream verification that raw data meets requirements | General check of adherence to good documentation practices |
| Second check for data recording – Who? | Real-time by peer. Downstream quality unit review | Peer review | Peer review |
| Second check for data recording – When? | Real-time by peer | Before the next critical process step or before … | Before batch release or within timeframe … |
system. This may occur, for instance, when data must be entered manually into an MES from a printout because equipment is unable to connect to the MES.

For High-criticality data, like a CPP, second-person verification against the initial recorded data must be performed.

For Medium-criticality data, like a reading of environmental data for a compression operation, the second-person verification may be performed by a peer. For Low-critical data, like the time of starting manufacturing, no second verification is needed.
5.3.3.3 True Copy (Paper to Electronic)

Table 5.3.3.3-1 illustrates the controls needed for creation of a true copy, going from a paper record to an electronic record. True copies must be maintained using the same controls that applied to the original paper record.

For High-criticality records, such as paper batch records that may be needed for downstream use by a sister site or another department, a trained user can make the true copy, but a second person from the quality unit must perform a documented review. The review must ensure the scan is legible, accurate, and complete (all pages, all entries) and confirm that all entries on the true copy provide all information and data that are contained on the original record. For example, if data is color-coded or highlighted, the true copy must reflect the same color-coding or highlighting, with the intent of capturing the context and meaning. In addition, the original record may be discarded with quality unit oversight, unless the original document has a seal, watermark, or similar identifying mark that cannot be accurately reproduced electronically, in which case it cannot be discarded. The quality unit should determine the level of oversight required for destruction, which may range from quality unit presence during destruction to periodic confirmation of compliance with procedures.
For Medium-criticality records, like a validation protocol, a trained user can make the true copy. A documented second-person review is required, though the second person need not be from the quality unit. The original record may be discarded by the operating unit unless the original document has a seal, watermark, or identifying mark. While the quality unit does not need to be directly involved in the destruction process, it must retain a suitable level of oversight to ensure the procedures are followed.
For Low-criticality records, such as completed user access authorization forms where evidence of training exists in another system, the person making the true copy must document that they reviewed the scan for legibility, accuracy, and completeness. The original may be discarded by the individual, with quality unit oversight to ensure that procedures are followed.
ro
As with all controls relating to data integrity, processes relating to true copies, regardless of critical-
ity, must be documented in SOPs. A record must be maintained to reflect when each document was
G
completeness
am
| Prevention Control | High Data Criticality | Medium Data Criticality | Low Data Criticality |
| --- | --- | --- | --- |
| Discard of original allowed | Yes, as defined by quality unit oversight, unless there is a seal, watermark, or other identifier that can't be accurately reproduced electronically | Yes, performed by the operating unit, unless there is a seal, watermark, or other identifier that can't be accurately reproduced electronically. Quality unit oversight required | Yes, individual can discard original. Quality unit oversight required |
For systems storing High-criticality data, such as an MES, historian, or filter integrity tester used during sterile filling, unique identification and authentication controls should be in place for each individual user. Other features also should be available for use, such as mandatory password changes every 90 days, lock-out after five failed attempts, password recycling only allowed after 10 cycles, and an annual review of user accounts.
For systems storing Medium-criticality data, like parameter-setting or executed run files for a washing machine, many of the controls for systems storing Highly critical data likewise should be applied. Certain measures can be less stringent: mandatory password changes can occur every 180 days (instead of 90), passwords can be recycled after 5 cycles (instead of 10), and users locked-out after 10 failed attempts (instead of 5).
For Low-critical systems, for example, an HMI on a secondary packaging line that controls conveyor speed but does not store data, the access controls may be far less rigorous. For these types of systems, a passcode or group accounts are acceptable. Controls must be in place, however, to ensure segregation of duties between user levels (i.e., administrator vs. engineer vs. operator). The passcode or group accounts should be changed annually. These controls may also be used for other Low-critical activities that do not require the action to be attributable. For example, a group account may be used to press the start/stop button on a packaging line when the system requires a log-in to activate the start/stop function.
Automatic log-out is not discussed as a differentiated control but should be considered as an overall control factor. The amount of time that should be allowed to pass before an inactive user is automatically logged-out should be commensurate with the risks identified, and documented accordingly.
As stated above, the suggested frequencies at which certain activities should take place are illustrative only. Organizations should determine the frequencies that are most applicable to their operations using a risk-based approach to ensure that the periodicity of activities is commensurate with the risks identified. This assessment should consider the overall system architecture, the interfaces between systems, criticality of access by unauthorized persons, and the potential for network security breaches.
| Prevention Control | High Data Criticality | Medium Data Criticality | Low Data Criticality |
| --- | --- | --- | --- |
| Periodic user account access review | Annually | Annually | Every 2 years |
| Account lock-out after repeated incorrect password entries | 5 incorrect password entries | 10 incorrect password entries | Never |
| Password recycling – Reuse of previously used passwords | 10 cycles | 5 cycles | N/A |
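The differentiated access-control parameters discussed in this section can be summarized as a policy table. This is a minimal sketch: the parameter values are taken from the illustrative frequencies in the text (which are explicitly not mandates), while the dictionary layout and key names are assumptions for illustration only:

```python
# Illustrative parameter values only; each organization should set its own
# risk-based values, as the text emphasizes.
ACCESS_CONTROL_POLICY = {
    "high": {                            # e.g., MES, historian, filter integrity tester
        "unique_user_ids": True,
        "password_change_days": 90,
        "lockout_after_failures": 5,
        "password_reuse_after_cycles": 10,
        "account_review": "annually",
    },
    "medium": {                          # e.g., washing machine run files
        "unique_user_ids": True,
        "password_change_days": 180,
        "lockout_after_failures": 10,
        "password_reuse_after_cycles": 5,
        "account_review": "annually",
    },
    "low": {                             # e.g., HMI that stores no data
        "unique_user_ids": False,        # group accounts/passcode acceptable
        "password_change_days": 365,     # passcode changed annually
        "lockout_after_failures": None,  # never locked out
        "password_reuse_after_cycles": None,
        "account_review": "every 2 years",
    },
}
```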
in
5.3.5 Electronic Audit Trail Review
Audit trails are a form of metadata that records details such as the creation, addition, deletion, or alteration of data within a system. An audit trail provides controls for secure recording of lifecycle details, without overwriting the original record. Where equipment (e.g., instruments, devices) is used to capture, create, process, report, store, or archive relevant GMP data electronically, the system design must always include the provision and retention of a suitable audit trail.
Secure, computer-generated, time-stamped audit trails must allow for reconstruction of the history of events relating to the record, including the "who, what, and when" of the action. Where technically possible, comments should be added, in the case of modifications or corrections, so that the "why" of the activity also is recorded as part of the audit trail. Where not technically possible to record in the audit trail the reasons for changes or deletions to GMP-relevant data, alternate documentation should be used.
The electronic audit trail may comprise multiple sources related to the equipment; that is, the information may be held in files called something other than "Audit Trail," for example, event log, alarm log, system log, history file, trends, or reports. Understanding where the information is recorded within the system, at both the operating system and application level, is important to help protect the data and understand where in the system to look when performing an audit trail review.
In the following sections, audit trail review is discussed for three different types of data contained within the system:

Data associated with batch setup, for example, the recipe used, and the set-points or times planned for specific activities during the batch

Batch run data, which is data from the manufacturing process, for example, temperatures, duration of activities, who performed the activities

System configuration data, which are not batch-relevant data but the data associated with the configuration of the system, for example, user-account information, privileges granted to users, where data is stored
Reviewing audit trails enables detection of potential data integrity issues. Some regulations require that audit trails be reviewed on a periodic basis (2-3). There is no expectation that every part of the audit trail will be reviewed for every batch; rather, it is expected that a risk-based approach will be taken to identify the appropriate level of review to conduct.
The ATRA focuses on the detectability of changes to the data, the criticality of the data (in terms of patient safety or product quality), and the probability or capability for changes to the data to occur.
A cross-functional team should be assembled to conduct the ATRA. The team should comprise people who fully understand the technical capabilities and limitations of the system, how the system is set up (from a user management perspective), where the data is stored, and what type of data is part of the audit trail. The team also should include users who are familiar with operating the system.
The first step in the process is to score each of the three factors—detectability, severity, and probability—that may require review based on the system controls in place for each element in the audit trail. For each factor, a lower score represents lower risk. Examples of scores that may be used are provided in this section, although each entity may modify the scoring based on its own analysis and risk tolerance.
Detectability Score is based upon the likelihood that modification or deletion of data will be detected.

If it is possible to detect any modifications or deletions to the data contained within the audit trail, and this data is already reviewed as part of an existing review as required by SOP, performing additional review of the audit trail to detect potential data integrity issues is not required. For example, if the batch report lists the critical parameters used during manufacturing and shows any changes made to these parameters during batch manufacturing, an audit trail review of the same information need not be conducted. (This assumes that the batch report is generated …)
If the data has not already been reviewed or has only been partially reviewed (i.e., not all critical data is in scope of the batch-wise data review), further assessment of the audit trail may be required. Score = 1.
Severity Score is based upon the data criticality; it assesses the potential impact if data is modified or deleted. There are three possible Severity Scores: 10, 4, or 1.

If the data is considered Highly critical or has a direct impact on patient safety or product quality (e.g., changing the temperature to make it appear it was within the validated range, despite being outside the range), the potential impact is of high severity. Score = 10.
If the data has an indirect impact on patient safety or product quality, i.e., changes to data that result in a compliance risk only (e.g., changes to time zone where the raw data and batch-related data are not affected), it is of medium potential severity. Score = 4.

If the data has negligible or no impact on patient safety or product quality (e.g., the start time of an operation, provided that this is not a CPP and does not affect duration), it is of low potential severity. Score = 1.
Probability Score is based on the capability of the data to be modified or deleted by system users and administrators. There are three possible Probability Scores: 10, 2, or 1.
Next, the three scores (Detectability, Severity, and Probability) are multiplied to provide the ATRA final score. The lower the score, the lower the risk.
Table 5.3.5.1-1 illustrates how the final score can be used to determine the needed frequency of review. Some examples are provided in Section 5.3.5.2 that highlight how this can be used together with different elements of the audit trail to determine the full audit trail review requirements for a system.
Table 5.3.5.1-1 ATRA Final Score and Frequency Review

Needed Frequency of Audit Trail Review:

| Frequency of System Use | 0–10 Score | 11–20 Score | 40 Score | 100 Score |
| --- | --- | --- | --- | --- |
| Daily / Weekly | Event-based | Twice per year | Weekly | Batch-wise |
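The multiplication of the three factor scores and the frequency lookup can be sketched as follows. This is an illustration under stated assumptions (the function names are ours, and only the daily/weekly-use mapping from Table 5.3.5.1-1 is implemented), not a prescribed tool:

```python
def atra_final_score(detectability: int, severity: int, probability: int) -> int:
    """Multiply the three ATRA factor scores; a lower result means lower risk.
    Per Section 5.3.5.1, severity is 10, 4, or 1 and probability is 10, 2, or 1."""
    return detectability * severity * probability

def review_frequency(score: int) -> str:
    """Map a final score to an audit trail review frequency for a system
    used daily or weekly; other usage frequencies need their own mapping."""
    if score <= 10:
        return "event-based"
    if score <= 20:
        return "twice per year"
    if score <= 40:
        return "weekly"
    return "batch-wise"
```

For example, an element that is not otherwise reviewed (detectability 1), directly impacts product quality (severity 10), and is freely editable (probability 10) scores 100 and would be reviewed batch-wise.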
The timeframes indicated are illustrative only. Organizations should assess their systems and apply critical thinking when considering the criticality of the activity being performed as well as the potential for impact on product quality and patient safety. Using a risk-based approach, the periodicity of the review should be determined.

If during periodic review a number of data integrity breaches are discovered, the controls that are (or should be) in place and/or the frequency of the review should be reevaluated to prevent future breaches or to detect breaches sooner.
An audit trail contains data from (1) batch setup data, (2) batch run data, and (3) system configuration data. Each of these parts of the audit trail has a number of elements that could be considered for audit trail review. Hence, the ATRA tool should be used for each individual element to determine the final score.
Elements for the batch setup data and batch run data parts of the audit trail, where data controls
may be vulnerable and may have an impact on critical batch data, might include:
Were any of the recipe parameters modified outside the validated range during the execution of
the batch?
Were changes made to CPPs (set points or allowed tolerance ranges) during manufacturing?
Were these within the validated ranges?
Were the calculation parameters edited or modified?
Were changes to the time made that impact the duration of (time-sensitive) parts of manufactur-
ing, i.e., an operation ran longer or shorter than was expected?
Similarly, for the system configuration data, a number of elements may be vulnerable and should be considered when using the ATRA tool, such as:

Were changes made to the system date or time?

Was the report template or any batch report modified, edited, deleted, or renamed?

What changes have been made to user access accounts or the privileges assigned at the user level? Do these changes still ensure segregation of duties?

Were any edits, modifications, or deletions made to the system configuration (e.g., administration changes, database settings, change of data storage location, system settings or functionality)?

Were there changes, edits, or deletions of audit trails, history logs, or files? Were any renamed?

Was any unusual activity recorded in the user activities log (login, logoff, unauthorized logins)?
Systems with high technical capabilities that are being fully utilized (e.g., strong access control, segregation of duties) are less vulnerable; therefore, a significantly reduced frequency of audit trail review is acceptable. On the other hand, if the system has limited technical capabilities (e.g., no difference between administrator and operator privileges), the system is more vulnerable, and a more extensive review of the audit trail would be appropriate.
Table 5.3.5.2-1 contains a few examples of how the ATRA tool can be used for a filter integrity tester. The example does not represent a full ATRA review for a filter integrity tester but covers a few audit trail elements for demonstrative purposes to clarify the use of an ATRA.

The scores in these examples are based on the definitions in Section 5.3.5.1. A brief explanation is added to justify the scoring. The scores for detectability, severity, and probability are multiplied, giving a Final Score. Then, the data in Table 5.3.5.1-1 is used to determine the frequency of audit trail review.
Completing the ATRA for the other elements of the audit trail will allow the user to more fully understand the parts of the audit trail that require review and the necessary frequency of that review.
5.3.6 Backup of Electronic Data
The next category relates to preventive controls associated with the backup of electronic data. Two different scenarios have been identified where it is acceptable to differentiate the controls applied based on criticality.
The first scenario relates to the frequency of performing the backup of the manufacturing data stored electronically. Companies should consider developing a principle-based backup strategy in order to differentiate the frequency of backups. This strategy should take into account the company's risk tolerance, i.e., how manufacturing activities can be recreated in the event of a system failure when data has not been backed up. If the only evidence of an activity that impacts the safety and/or quality of the product is through the lost electronic data, the company must be prepared to reject the batch, and a more frequent backup may be warranted. On the other hand, if the major activities can be recreated using other means (e.g., printouts, manually recorded information), that may be used to justify a less frequent backup.
In addition, the volume of data being generated within a given time period and the capability of the electronic system to store that volume of data without overwriting existing data should also be considered. This situation will be further discussed in the second scenario.
If the equipment is capable of automated backups, backups are recommended at the same frequency at which data is generated. If new data is being generated daily (e.g., using an MES or historian), a daily backup is warranted.
The second scenario is associated with looped memory. For the purposes of this technical report, looped memory is used in an electronic system with a limited storage capacity. Once the capacity limit is reached, new data automatically overwrites older data. For example, in a compression machine capable of storing 10 batches' worth of data, the data for Batch 11 automatically overwrites the data for Batch 1.
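The looped-memory behavior can be illustrated with a fixed-size buffer; a sketch only, with Python's `deque` standing in for the machine's storage (real equipment behavior varies by make and model):

```python
from collections import deque

# A machine that can store 10 batches' worth of data: once the 11th
# batch is written, the oldest batch is silently lost unless a backup
# was taken first.
looped_memory = deque(maxlen=10)
for batch_no in range(1, 12):          # write Batches 1 through 11
    if len(looped_memory) == looped_memory.maxlen:
        pass  # a compliant process would back up the oldest batch here
    looped_memory.append(f"Batch {batch_no}")

# Batch 1 has now been overwritten; the buffer holds Batches 2-11.
```

This is why, for High- and Medium-critical data, the backup must run before the overwrite occurs rather than on a calendar schedule alone.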
Table 5.3.6-1 describes controls for systems with a looped memory. For systems storing Highly critical data, like certain makes or models of filter integrity testers used in sterile filling operations, the backup should occur before the data is overwritten. A similar expectation should be in place for systems storing Medium-critical data, such as a pH meter used on the shop floor. For systems storing Low-critical data, like secondary packaging systems that do not store data relating to quality attributes or such parameters as expiry dates, the backup should ensure that the data will not be overwritten before batch disposition has occurred.
Table 5.3.6-1 Backup of Electronic Data

| Prevention Control | High Data Criticality | Medium Data Criticality | Low Data Criticality |
| --- | --- | --- | --- |
5.3.7 Use of Electronic Data
Once electronic data is generated, the information may be further processed in numerous ways. In the two scenarios described in Sections 5.3.7.1 and 5.3.7.2, the controls to be applied may be differentiated based on criticality: when electronic data is exported to generate reports or records, and when data is transferred or migrated between electronic systems. These controls are targeted at the data and record levels to help prevent a data integrity incident from occurring.
Table 5.3.7.1-1 identifies prevention controls related to reports and records generation. For reports and records used to display High-critical data, such as a batch record from an MES used for making release decisions, validation with change control and appropriate test plans is required for changes to the report template. The validation should ensure that the report or record is accurate, and that all data displayed will remain unchanged from the electronic raw data. IT or automation personnel should perform these changes to the template and ensure that the content of the reports and records remains accurate.
For reports and records related to Medium-critical data, such as a building management system monitoring temperature and humidity, IT or automation personnel should perform an initial validation. Any changes to the report template should be performed following SOPs, but revalidation of the template would not be required. That the report contains limited flexibility (via filters or drop-downs) and is working as intended is acceptable, so the user may run these reports based on a pre-approved template.
For reports and records displaying Low-critical data, verification governed by work instructions should suffice. The business unit creates these reports and records, and their content may be flexible in nature, for example, exporting data into a data lake for future non-GMP analysis.
Regardless of the criticality, the quality unit should review and approve changes to report templates, especially for any changes that could impact the information displayed.
Table 5.3.7.1-1 Data Integrity Control Grid for Data Export for Generation of Reports and Records
One example is the migration of data from one platform to another as part of a software or hardware upgrade, for example, when moving from one software version to another, moving from an old server to a new server, or moving to the cloud. This type of transfer is generally considered a once-off activity specifically associated with the upgrade.
A second example is the transfer of High- and Medium-critical batch data (e.g., CPPs, calibration records, environmental monitoring data) from a standalone system to an upper-level system like an MES or data historian. This type of data transfer occurs on a defined periodic basis (e.g., once a day, once a week, after each batch) based on how the data transfer has been set up. For both examples, the transfer or migration of the data should be safeguarded by change control, a test plan, and a tool to verify that all the data have been transferred or migrated.
Data of varying criticalities are regularly transferred into different formats for making continuous process verification conclusions, for example, transferring data into a data lake for further trending or analysis. These are considered Low-critical transfers, so a procedural control is adequate for describing the steps to be taken to transfer the data.
Table 5.3.7.2-1 Data Integrity Control Grid for Data Transfer and Migration Between Electronic Systems

| Data Transfer and Migration Prevention Control | High Data Criticality | Medium Data Criticality | Low Data Criticality |
| --- | --- | --- | --- |
| Verification of data | Checksum, or equivalent tool, that confirms all …, executed to ensure … | Checksum, or equivalent tool, that confirms all … | N/A |
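A checksum comparison of the kind called for when verifying a transfer or migration might look like the following minimal sketch. The record representation (byte strings) and the function names are assumptions for illustration; the report does not prescribe a particular tool:

```python
import hashlib

def checksum(record: bytes) -> str:
    """SHA-256 digest of one record's byte content."""
    return hashlib.sha256(record).hexdigest()

def verify_migration(source_records, migrated_records) -> bool:
    """Confirm that every source record arrived unchanged: same count,
    and matching checksums record by record."""
    if len(source_records) != len(migrated_records):
        return False
    return all(checksum(s) == checksum(m)
               for s, m in zip(source_records, migrated_records))
```

Such a comparison would be executed under change control as part of the migration test plan, with the result retained as evidence that all data were transferred.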
Pharmaceutical manufacturing data is being collected at an accelerated rate, beyond human processing capability using traditional mechanisms. These challenges increasingly drive the need for governance and active controls of data and information. Processes, roles, and standards must be created to allow the effective and efficient use of information and technology tools to enable an organization to achieve its goals.
A forthcoming PDA technical report will provide additional guidance on governance and controls for big data in manufacturing. That technical report will address topics such as data classification, data ownership, access management, source data, results and data insights, and change management, within the framework of governance and controls of big data in manufacturing.
Collectively, these two technical reports will provide guidance on the management of data and big
data in a regulated pharmaceutical or biopharmaceutical facility.
3. Medicines and Healthcare products Regulatory Agency. 'GXP' Data Integrity Guidance and Definitions, Rev. 1; MHRA: London, 2018.

4. World Health Organization. Annex 5: Draft Guidance on Good Data and Record Management Practices; WHO: Geneva, 2016; p 46.

5. Pharmaceutical Inspection Convention/Co-operation Scheme (PIC/S). Draft PIC/S Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments; PIC/S: Geneva, 2018.

6. Parenteral Drug Association. Technical Report No. 54-2: Implementation of Quality Risk Management for Pharmaceutical and Biotechnology Manufacturing Operations, Annex 1: Case Study Examples for Quality Risk Management in Packaging and Labeling; PDA: Bethesda, Md., 2013.

7. International Council for Harmonisation. Q10: Pharmaceutical Quality System; ICH: Geneva, 2008.

8. International Council for Harmonisation. Q9: Quality Risk Management; ICH: Geneva, 2005.

9. … and Reports; Government Publishing Office: Washington, D.C., 2005.

10. U.S. Food and Drug Administration. PART 133—Drugs; Current Good Manufacturing …

14. F-D-C Reports, Inc. Data Manipulation is Being Uncovered and Referred for Criminal Investigation. The Gold Sheet 2007, 41 (4).

15. U.S. Food and Drug Administration. Warning Letter No. 320-19-29 to Indoco Remedies Limited, dated 7/9/2019; Office of Compliance, Center for Drug Evaluation and Research, U.S. Department of Health and Human Services: Silver Spring, Md., 2019.

16. U.S. Food and Drug Administration. Warning Letter No. 320-20-15 to Apollo Health and Beauty Care, Inc., dated 12/23/2019; Office of Compliance, Center for Drug Evaluation and Research, U.S. Department of Health and Human Services: Silver Spring, Md., 2019.

17. U.S. Food and Drug Administration. 21 CFR 58 Good Laboratory Practice for Nonclinical Laboratory Studies; U.S. Department of Health & Human Services: Washington, DC, 1978.

18. … Chromatography Practices, Draft for Comments (July 2019).

19. National Medical Products Administration. Draft Drug Data Management Standard; NMPA: Beijing, 2016.

20. …; TGA: Symonston ACT, Australia, 2017.

21. Federal State Institute of Drugs and Good Practices. Guidelines: Data Integrity & Computer System Validation; SID&GP: Moscow, 2018.

…. Parenteral Drug Association. Technical Report No. 54: Implementation of Quality Risk Management for Pharmaceutical and Biotechnology Manufacturing Operations; PDA: Bethesda, Md., 2012; p 60.

29. European Commission. Annex 11: Computerised Systems, EudraLex – Volume 4 – Good Manufacturing Practice for Medicinal Products for Human and Veterinary Use; European Commission: Brussels, 2011.
8.0 Appendix I: Examples — How to Use the 9-Box Vulnerability Grid

8.1 API Process Examples
The following examples demonstrate how High, Medium, and Low criticality data from various API manufacturing processes reside in the 9-Box vulnerability grid based on the level of controls in place. The vulnerability of data from each unit operation must be considered separately, because the data-capture features of various manufacturing equipment differ, the method of recording data varies, and the data integrity controls in place will vary. Once evaluated, however, data vulnerability … data and the existing data integrity controls in place. The criticality of the data will be either High, Medium, or Low as defined in Section 4.2.1 Classification of Data Criticality. Table 4.2.1-1 contains definitions of the classes of data criticality. The criticality of the data determines the row of the 9-Box vulnerability grid in which the data will reside. As shown in Table 4.3-1, High-critical data is in the top row, Medium-critical data in the middle row, and Low-critical data in the bottom row.
For the purposes of this technical report, the existing data integrity controls will be considered as High, Medium, or Low based on the definitions in Table 4.2.2-1. The level of existing data integrity controls associated with the manufacturing process will determine in which column of the 9-Box vulnerability grid the data belongs. Manufacturing processes with a High level of data integrity controls in the form of computerized controls reside in the first column (moving from left to right); manufacturing processes with a Medium level of data integrity controls in the form of hybrid systems reside in the second column; and manufacturing processes with a Low level of data integrity controls in the form of manual records reside in the third column.
As the user moves down the grid from top to bottom, the criticality of the data decreases. As a user
moves across the grid from left to right, the level of control decreases. e goal of decreasing the
vulnerability of the data is not impacted by the criticality of the data; instead, the goal of decreasing
the vulnerability of the data is achieved by increasing the controls.
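The row-and-column logic above can be captured as a simple lookup. The sketch below is illustrative only: the function and dictionary names are our own, and the cell values are inferred from the worked examples later in this appendix; the authoritative definitions of criticality, control level, and vulnerability remain those in Sections 4.2 and 4.3.

```python
# Illustrative sketch of the 9-Box vulnerability grid described above.
# Rows = data criticality; columns = level of existing data integrity controls.
# Cell values follow the worked examples in this appendix (1a-1c, 2a, 3a-3c, 5a-5c).

NINE_BOX = {
    # (criticality, controls): vulnerability level (color)
    ("High",   "High"):   "Acceptable",   # Green  (e.g., Example 1c)
    ("High",   "Medium"): "Moderate",     # Orange (e.g., Example 1b)
    ("High",   "Low"):    "Significant",  # Red    (e.g., Example 1a)
    ("Medium", "High"):   "Negligible",   # Blue   (e.g., Example 5c)
    ("Medium", "Medium"): "Acceptable",   # Green  (e.g., Example 5b)
    ("Medium", "Low"):    "Moderate",     # Orange (e.g., Example 2a)
    ("Low",    "High"):   "Negligible",   # Blue   (e.g., Example 3c)
    ("Low",    "Medium"): "Negligible",   # Blue   (e.g., Example 3b)
    ("Low",    "Low"):    "Acceptable",   # Green  (e.g., Example 3a)
}

def data_vulnerability(criticality: str, controls: str) -> str:
    """Return the vulnerability level for one unit operation's data.

    Criticality cannot be changed; only increasing the controls
    (moving left across the grid) reduces the vulnerability.
    """
    return NINE_BOX[(criticality, controls)]
```

For instance, `data_vulnerability("High", "Low")` returns `"Significant"`, matching Example 1a, while adding controls to reach `("High", "High")` yields `"Acceptable"`, matching Example 1c.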
Example 1 involves a reactor in an API manufacturing process (Table 8.1-1). The reaction needs to be precisely controlled to maximize the production of the API intermediate while controlling an unwanted and difficult-to-remove impurity that will impact a critical quality attribute (CQA) of the API. In Example 1a, CPP data is transcribed manually from production instrumentation into the paper batch production record (BPR) used in the review and release process. The High criticality of the data combined with the Low level of controls yields a Significant data vulnerability risk (Red), as defined in Section 4.3.

Example 1b illustrates the same High-criticality data but with the reactor better controlled using a validated programmable logic controller (PLC). The data integrity controls are increased, with PLC printouts included with the BPR for the review and release process rather than relying on manual transcription of data from the human machine interface (HMI) of the PLC into a paper BPR. Therefore, Example 1b yields a Moderate risk of data vulnerability (Orange).

Example 1c illustrates the same High-criticality data but with the use of additional controls: a validated electronic batch record (EBR) with automatic data capture; the inability to erase any data; the use of access controls, activated audit trails, and automated data archiving; and the ability to trend the electronically stored production data. With these additional controls employed, the vulnerability of the High-criticality data has reached an Acceptable level (Green).
Table 8.1-1 Example 1

Ex 1a
Mfg. Operation: Manually controlled reactor and paper BPR
Data Criticality: High (Impacts CQA & is a CPP that controls impurity formation)
Data Vulnerability Factors:
  Human Factor: Manual transfer of CPP data from production instrumentation & gauges into paper BPR
  GMP Process: Reactor is manually controlled by operator
  Data Mgmt: Paper BPR
Data Controls in Place: Low
  Human Factor: Supervisory review & quality unit review at release
  GMP Process: Controlled with written procedures and training
  Data Mgmt: Reliance on second-person witnessing of data entries & quality unit review at release
Data Vulnerability Level: Significant

Ex 1b
Mfg. Operation: PLC-controlled reactor and paper BPR
Data Criticality: High (Impacts CQA & is a CPP that controls impurity formation)
Data Vulnerability Factors:
  Human Factor: Manual transfer of CPP data from HMI into BPR
  GMP Process: PLC-controlled reactor of new or complex process
  Data Mgmt: Manual data entry into paper BPR from HMI; no audit trail or electronic analysis of data in PLC
Data Controls in Place: Medium
  Human Factor: Supervisory review & quality unit review at release
  GMP Process: PLC validated with printout attached to paper BPR
  Data Mgmt: Reliance on second-person witnessing of data entries; quality unit review of BPR & PLC printout at release
Data Vulnerability Level: Moderate

Ex 1c
Mfg. Operation: PLC-controlled reactor & electronic BPR
Data Criticality: High (Impacts CQA & is a CPP that controls impurity formation)
Data Vulnerability Factors:
  Human Factor: Electronic BPR mitigates human error to a residual level
  GMP Process: PLC-controlled reactor of robust process
  Data Mgmt: Data automatically captured in EBR & data mgmt lifecycle is controlled
Data Controls in Place: High
  Human Factor: EBR reviewed by quality unit at release
  GMP Process: PLC validated & production data captured automatically
  Data Mgmt: Access controls on PLC; electronic analysis of data; automatic data storage & audit trail
Data Vulnerability Level: Acceptable
Example 2 covers the oven drying of an API (Table 8.1-2). The loss on drying (LOD) of the API is not a CQA and the drying operation is not a CPP, but the LOD does impact the flow characteristics of the API, which affects the downstream processing of the API into the finished product. Therefore, the data regarding the drying of the API is of Medium criticality and belongs in the middle row of the 9-Box vulnerability grid.

Example 2a describes Medium-criticality data with the manual operation of the drying oven, manual data capture, and manual transcription of data into a paper manufacturing record. In addition, the existing data integrity controls are Low-level, with reliance on second-person witnessing of the manufacturing operation, human oversight of the manufacturing process, and human review at release. The Medium criticality of the data plus the Low level of controls yields Moderate data vulnerability (Orange), as described in Section 4.3.
In Example 3a, the cleaning is performed manually, then the Low-criticality data is captured by the operator, who transcribes it into the paper manufacturing record. The reliance on second-person witnessing, human oversight of the manufacturing process, and human review at release provides a Low level of data integrity controls. The Low criticality of the data plus the Low level of controls yields Acceptable data vulnerability (Green).

Example 3b illustrates the same Low-criticality data, but the level of controls in place is more than required. The process is better controlled through a validated clean-in-place (CIP) process. The data integrity controls have increased with the PLC printouts from the CIP system, which are included with the BPR for the review and release process. The additional controls yield Negligible data vulnerability (Blue).

Example 3c illustrates the same Low-criticality data but with the addition of a validated EBR with automatic data capture, the inability to erase any data, activated audit trails, automated data archiving, and the ability to trend the electronically stored production data. The vulnerability of the data remains unchanged at a Negligible level (Blue). In this example, there are significantly more controls in place than are required.
Table 8.1-3 Example 3

Ex 3a
Mfg. Operation: Manual batch-to-batch cleaning of equipment within a campaign
Data Criticality: Low (Batch-to-batch carryover will not impact the quality of the final API for validated campaign length)
Data Vulnerability Factors:
  Human Factor: Manual recording of cleaning into paper BPR
  GMP Process: Equipment cleaning manually performed by operator
  Data Mgmt: Paper BPR
Data Controls in Place: Low
  Human Factor: Supervisory review & quality unit review at release
  GMP Process: Controlled with written procedures and training
  Data Mgmt: Reliance on second-person witnessing of data entries & quality unit review at release
Data Vulnerability Level: Acceptable

Ex 3b
Mfg. Operation: CIP batch-to-batch cleaning of equipment within a campaign
Data Criticality: Low (Batch-to-batch carryover will not impact the quality of the final API for validated campaign length)
Data Vulnerability Factors:
  Human Factor: Manual transfer of cleaning data from HMI into BPR
  GMP Process: PLC-controlled CIP cleaning process
  Data Mgmt: Manual data entry into paper BPR from HMI; no audit trail or electronic analysis of data in PLC
Data Controls in Place: Medium
  Human Factor: Supervisory review & quality unit review at release
  GMP Process: PLC validated with printout attached to paper BPR
  Data Mgmt: Reliance on second-person witnessing of data entries; quality unit review of BPR & PLC printout at release
Data Vulnerability Level: Negligible

Ex 3c
Mfg. Operation: PLC-controlled CIP batch-to-batch cleaning & electronic BPR
Data Criticality: Low (Batch-to-batch carryover will not impact the quality of the final API for validated campaign length)
Data Vulnerability Factors:
  Human Factor: EBR mitigates human error to a residual level
  GMP Process: PLC-controlled CIP cleaning process is robust
  Data Mgmt: Data automatically captured in EBR & data mgmt lifecycle is controlled
Data Controls in Place: High
  Human Factor: EBR reviewed by quality unit at release
  GMP Process: PLC validated & cleaning data captured automatically
  Data Mgmt: Access controls on PLC, electronic analysis of data, automatic data storage & audit trail
Data Vulnerability Level: Negligible
Example 4a illustrates High-criticality data with the operator manually interpolating analogue humidity data from a chart recorder and transcribing this data to a paper manufacturing record. A Low level of data controls exists, relying on a second person to verify the data. The High criticality of the data combined with the Low level of controls creates Significant data vulnerability (Red), as defined in Section 4.3.

Example 4b illustrates the same High-criticality data. In this example, the data controls have increased compared to Example 4a. The humidity data is now digitized and captured using a validated building management system (BMS) with audit trail and access controls. Also, a printout of the humidity data is attached to the manufacturing record. The additional controls yield Moderate data vulnerability (Orange).

Example 4c illustrates the same High-criticality data with the additional controls of a validated electronic manufacturing record with automated data capture, audit trail, and access controls. These additional controls yield Acceptable vulnerability (Green).
Table 8.2-1 Example 4

Ex 4a
Mfg. Operation: Humidity data is displayed and recorded on a paper chart, and a paper manufacturing record is used
Data Criticality: High (Humidity is a CPP that impacts CQA; drug substance is hygroscopic and will degrade with elevated moisture levels)
Data Vulnerability Factors:
  Human Factor: Analogue humidity data is added from chart recording, and data is transcribed to paper record
  GMP Process: Humidity data is captured manually
  Data Mgmt: Periodic manual entry of data into BPR
Data Controls in Place: Low
  Human Factor: Supervisor review and quality unit review at release
  GMP Process: Chart paper is attached to paper record
  Data Mgmt: Second-person verification of data entry and quality unit review at release
Data Vulnerability Level: Significant

Ex 4b
Mfg. Operation: Humidity data is displayed and recorded using BMS, and a paper manufacturing record is used
Data Criticality: High (Humidity is a CPP that impacts CQA; drug substance is hygroscopic and will degrade with elevated moisture levels)
Data Vulnerability Factors:
  Human Factor: Digital humidity data is printed out from HMI, and data is transcribed to paper record
  GMP Process: Humidity data is captured both manually and electronically
  Data Mgmt: Periodic manual data entry into paper BPR
Data Controls in Place: Medium
  Human Factor: Supervisor review and quality unit review at release
  GMP Process: HMI printout is attached to paper record; validated BMS with audit trail and access controls
  Data Mgmt: Second-person verification of printout and quality unit review at release
Data Vulnerability Level: Moderate

Ex 4c
Mfg. Operation: Humidity data is displayed and recorded using BMS, and an electronic manufacturing record is used
Data Criticality: High (Humidity is a CPP that impacts CQA; drug substance is hygroscopic and will degrade with elevated moisture levels)
Data Vulnerability Factors:
  Human Factor: An electronic manufacturing record mitigates human error to a residual level
  GMP Process: Humidity data is captured electronically
  Data Mgmt: Humidity data is automatically uploaded into electronic manufacturing record
Data Controls in Place: High
  Human Factor: Use of electronic record minimizes manual entry of data
  GMP Process: Validated BMS with audit trail and access controls; automated data capture in electronic record
  Data Mgmt: Humidity data is managed per Part 11 requirements
Data Vulnerability Level: Acceptable
Example 5a illustrates Medium-criticality data with the operator manually interpolating analogue humidity data from a chart recorder and transcribing the data to a paper manufacturing record. In addition, a Low level of data controls exists, relying on a second person to verify the data. The Medium criticality of the data combined with a Low level of control creates Moderate data vulnerability (Orange).

Example 5b illustrates the same Medium-criticality data, but the data controls have increased compared to Example 5a. The humidity data is now digitized and captured using a validated BMS, with audit trail and access controls. Also, a printout of the humidity data is attached to the manufacturing record. The additional controls yield Acceptable data vulnerability (Green).

Example 5c illustrates the same Medium-criticality data with the added controls of a validated electronic manufacturing record with automated data capture, audit trail, and access controls. The additional controls yield Negligible vulnerability (Blue).
Table 8.2-2 Example 5: Medium Criticality – Humidity Monitoring During Sieving of a Drug Substance

Ex 5a
Mfg. Operation: Humidity data is displayed and recorded on a paper chart; a paper manufacturing record is used
Data Criticality: Medium (Humidity is not a CPP; at elevated humidity levels, drug substance will become sticky, giving poor flow properties)
Data Vulnerability Factors:
  Human Factor: Analogue humidity data is interpolated from chart recording, and data is transcribed to paper record
  GMP Process: Humidity data is captured manually
  Data Mgmt: Periodic manual entry of data into BPR
Data Controls in Place: Low
  Human Factor: Supervisor review and quality unit review at release
  GMP Process: Chart paper is attached to paper record
  Data Mgmt: Second-person verification of data entry; quality unit review at release
Data Vulnerability Level: Moderate

Ex 5b
Mfg. Operation: Humidity data is displayed and recorded using BMS; a paper manufacturing record is used
Data Criticality: Medium (Humidity is not a CPP; at elevated humidity levels, drug substance will become sticky, giving poor flow properties)
Data Vulnerability Factors:
  Human Factor: Digital humidity data is printed out from HMI, and data is transcribed to paper record
  GMP Process: Humidity data is captured both manually and electronically
  Data Mgmt: Periodic manual data entry into paper BPR
Data Controls in Place: Medium
  Human Factor: Supervisor review and quality unit review at release
  GMP Process: HMI printout is attached to paper record; validated BMS with audit trail and access controls
  Data Mgmt: Second-person verification of printout and quality unit review at release
Data Vulnerability Level: Acceptable

Ex 5c
Mfg. Operation: Humidity data is displayed and recorded using BMS; an electronic manufacturing record is used
Data Criticality: Medium (Humidity is not a CPP; at elevated humidity levels, drug substance will become sticky, giving poor flow properties)
Data Vulnerability Factors:
  Human Factor: An electronic manufacturing record mitigates human error to a residual level
  GMP Process: Humidity data is captured electronically
  Data Mgmt: Humidity data is automatically uploaded into electronic manufacturing record
Data Controls in Place: High
  Human Factor: Use of electronic record minimizes manual entry of data
  GMP Process: Validated BMS with audit trail and access controls; automated data capture in electronic record
  Data Mgmt: Humidity data is managed per Part 11 requirements
Data Vulnerability Level: Negligible
Example 6a illustrates Low-criticality data with the operator manually interpolating analogue humidity data from a chart recorder and transcribing this data to a paper manufacturing record. In addition, a Low level of existing data controls exists, and a second person must verify the data. The Low criticality of the data combined with a Low level of controls creates an Acceptable data vulnerability risk (Green).

Example 6b illustrates the same Low-criticality data; however, the data controls have increased compared to Example 6a. The humidity data is now digitized and captured using a validated BMS, with audit trail and access controls. Also, a printout of the humidity data is attached to the manufacturing record. The additional controls yield a Negligible data vulnerability risk (Blue).

Example 6c illustrates the same Low-criticality data with the added controls of a validated electronic manufacturing record with automated data capture, audit trail, and access controls. The vulnerability of the data remains unchanged at a Negligible level (Blue). In this example, significantly more controls are in place than are required.
Table 8.2-3 Example 6: Low Criticality – Humidity Monitoring During Sieving of a Drug Substance

Ex 6a
Mfg. Operation: Humidity data is displayed and recorded on a paper chart; a paper manufacturing record is used
Data Criticality: Low (Humidity is not a CPP; drug substance is not hygroscopic and is very stable with good flow properties over a wide humidity range)
Data Vulnerability Factors:
  Human Factor: Analogue humidity data is interpolated from chart recording, and data is transcribed to paper record
  GMP Process: Humidity data is captured manually
  Data Mgmt: Periodic manual entry of data into BPR
Data Controls in Place: Low
  Human Factor: Supervisor review and quality unit review at release
  GMP Process: Chart paper is attached to paper record
  Data Mgmt: Second-person verification of data entry; quality unit review at release
Data Vulnerability Level: Acceptable

Ex 6b
Mfg. Operation: Humidity data is displayed and recorded using BMS; a paper manufacturing record is used
Data Criticality: Low (Humidity is not a CPP; drug substance is not hygroscopic and is very stable with good flow properties over a wide humidity range)
Data Vulnerability Factors:
  Human Factor: Digital humidity data is printed out from HMI, and data is transcribed to paper record
  GMP Process: Humidity data is captured both manually and electronically
  Data Mgmt: Periodic manual data entry into paper BPR
Data Controls in Place: Medium
  Human Factor: Supervisor review and quality unit review at release
  GMP Process: HMI printout is attached to paper record; validated BMS with audit trail and access controls
  Data Mgmt: Second-person verification of printout; quality unit review at release
Data Vulnerability Level: Negligible

Ex 6c
Mfg. Operation: Humidity data is displayed and recorded using BMS; an electronic manufacturing record is used
Data Criticality: Low (Humidity is not a CPP; drug substance is not hygroscopic and is very stable with good flow properties over a wide humidity range)
Data Vulnerability Factors:
  Human Factor: An electronic manufacturing record mitigates human error to a residual level
  GMP Process: Humidity data is captured electronically
  Data Mgmt: Humidity data is automatically uploaded into electronic manufacturing record
Data Controls in Place: High
  Human Factor: Use of electronic record minimizes manual entry of data
  GMP Process: Validated BMS with audit trail and access controls; automated data capture in electronic record
  Data Mgmt: Humidity data is managed per Part 11 requirements
Data Vulnerability Level: Negligible
…to verify filter identity before sterilizing filtration, there is no regulatory requirement in the U.S. to perform such a test. In the EU, the regulatory requirements may differ, as Annex 1 states that "The integrity of the sterilised filter should be verified before use and should be confirmed immediately after use by an appropriate method…." (2). These examples are intended to clarify the application of the 9-Box grid; care should be taken to conform to local regulatory requirements.

Examples 7–9 demonstrate data integrity risk levels for filter integrity testing with three potential scenarios.
In Example 7, the data is considered to be of High criticality because the filter integrity testing impacts a CQA (sterility) and it is a CPP that confirms the absence of viable microorganisms after aseptic filling, which is a regulatory expectation (Table 8.3-1).

In Example 7a, the data vulnerability level is Significant (Red), as defined in Section 4.3, due to the high level of human intervention and because the legacy filter integrity tester has either no ability or a limited ability to store electronic data; a paper printout is used in lieu of electronic data.

In Example 7b, the data vulnerability becomes Moderate (Orange) when the potential for human error is reduced by using newer technology that features improved functionality to ensure automated data entry and storage.

Example 7c shows that the data vulnerability is Acceptable (Green) when human intervention is reduced to a minimum, with the filter testing performed in-line, without the operator having to initiate the test manually, and the test results automatically transferred to a secure data storage module.
Table 8.3-1 Example 7

Ex 7a
Mfg. Operation: Aseptic Filling: Filter integrity tester with paper printout test results
Data Criticality: High (Impacts CQA (sterility) and is a CPP that confirms absence of viable microorganisms after aseptic filling)
Data Vulnerability Factors:
  Human Factor: Manual transcription of set-up data by operator
  GMP Process: Filter integrity tester is manually controlled by the operator
  Data Mgmt: Legacy filter integrity tester used with no/limited ability to store electronic data; paper printout used in lieu of electronic data
Data Controls in Place: Low
  Human Factor: Supervisor reviews test results prior to batch release
  GMP Process: Controlled with written procedures and training
  Data Mgmt: Reliance on peer review of data entries and test results
Data Vulnerability Level: Significant

Ex 7b
Mfg. Operation: Aseptic Filling: Filter integrity tester with automated data entry and storage and electronic test results
Data Criticality: High (Impacts CQA (sterility) and is a CPP that confirms absence of viable microorganisms after aseptic filling)
Data Vulnerability Factors:
  Human Factor: Automated data entry and data back-up mitigates human error to a residual level
  GMP Process: Filter integrity tester is manually controlled by the operator
  Data Mgmt: Latest filter integrity tester is used, has ability to interface with bar code readers to capture set-up data, has secure data storage module, and can record operator's actions, including number of test runs, electronically and also on the paper printout
Data Controls in Place: Medium
  Human Factor: Supervisor checks test run alert data prior to batch release
  GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly
  Data Mgmt: Reliance on automation, validation, and robust monitoring process
Data Vulnerability Level: Moderate

Ex 7c
Mfg. Operation: Aseptic Filling: Filter integrity tester with end-to-end automation
Data Criticality: High (Impacts CQA (sterility) and is a CPP that confirms absence of viable microorganisms after aseptic filling)
Data Vulnerability Factors:
  Human Factor: End-to-end automation mitigates human error to a residual level
  GMP Process: Filter integrity testing instruments integrated into the unit operation
  Data Mgmt: Filter is tested in-line without operator having to initiate the test manually; test results are automatically transferred to a secure data storage module
Data Controls in Place: High
  Human Factor: Supervisory review is limited to only investigation of failed tests
  GMP Process: In-line test equipment and associated interfaces validated, and process robustness monitored regularly
  Data Mgmt: Reliance on automation, validation, and the robust monitoring process
Data Vulnerability Level: Acceptable
Example 8c shows that the data vulnerability becomes Negligible (Blue) when human intervention is reduced to a minimum, with the filter testing performed in-line, without the operator having to initiate the test manually, and test results automatically transferred to a secure data storage module.

Table 8.3-2 Example 8: Medium Criticality – Filter Integrity Testing at the Point of Fill (Pre-Use–Post-Sterilization)
Ex 8a
Mfg. Operation: Aseptic Filling: Filter integrity tester with paper printout test results
Data Criticality: Medium (Verifies filter integrity after sterilizing filtration but before aseptic filling)
Data Vulnerability Factors:
  Human Factor: Manual transcription of set-up data by the operator
  GMP Process: Filter integrity tester is manually controlled by the operator
  Data Mgmt: Legacy filter integrity tester used with no/limited ability to store electronic data; paper printout is used in lieu of electronic data
Data Controls in Place: Medium
  Human Factor: Supervisor reviews test results prior to initiating aseptic filling process
  GMP Process: Controlled with written procedures and training
  Data Mgmt: Reliance on peer review of data entries and test results
Data Vulnerability Level: Moderate

Ex 8b
Mfg. Operation: Aseptic Filling: Filter integrity tester with automated data entry and storage and electronic test results
Data Criticality: Medium (Verifies filter integrity after sterilizing filtration but before aseptic filling)
Data Vulnerability Factors:
  Human Factor: Automated data entry and data back-up mitigates human error to a residual level
  GMP Process: Filter integrity tester is manually controlled by the operator
  Data Mgmt: Latest filter integrity tester is used that has the ability to interface with bar code readers to capture set-up data, secure data storage module, and record operator's actions, including number of test runs, electronically and also on the paper printout
Data Controls in Place: High
  Human Factor: Supervisor checks test run alert data prior to aseptic filling process
  GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly
  Data Mgmt: Reliance on automation, validation, and robustness monitoring process
Data Vulnerability Level: Acceptable

Ex 8c
Mfg. Operation: Aseptic Filling: Filter integrity tester with end-to-end automation
Data Criticality: Medium (Verifies filter integrity after sterilizing filtration but before aseptic filling)
Data Vulnerability Factors:
  Human Factor: End-to-end automation mitigates human error to a residual level
  GMP Process: Process robustness assured by automation, validation, and monitoring practices; redundant sterilizing filters installed
  Data Mgmt: Filter is tested in-line without the operator having to initiate the test manually, and test results are automatically transferred to a secure data storage module
Data Controls in Place: High
  Human Factor: Supervisory review is limited to only investigation of failed test runs
  GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly
  Data Mgmt: Reliance on automation, validation, and robustness of monitoring process
Data Vulnerability Level: Negligible
Example 9c shows that the data vulnerability is reduced significantly and remains Negligible (Blue) when human intervention is reduced to a minimum, with the filter testing performed in-line, without the operator having to initiate the test manually, and the test results automatically transferred to a secure data storage module. While this example is theoretically possible to put into practice, it is operationally unlikely at the current time.

Table 8.3-3 Example 9: Low Criticality – Filter Identity Verification at the Point of Fill (Pre-Use Pre-Sterilization)
Ex 9a
Mfg. Operation: Aseptic Filling: Filter integrity tester with paper printout test results
Data Criticality: Low (Verifies filter identity to support correct filter installation)
Data Vulnerability Factors:
  Human Factor: Manual transcription of set-up data by the operator
  GMP Process: Filter integrity tester is manually controlled by the operator
  Data Mgmt: Legacy filter integrity tester used with no/limited ability to store electronic data; paper printout is used in lieu of electronic data
Data Controls in Place: High
  Human Factor: Supervisory review is limited to only investigation of failed tests
  GMP Process: Controlled with written procedures and training
  Data Mgmt: Reliance on peer review of test results
Data Vulnerability Level: Acceptable

Ex 9b
Mfg. Operation: Aseptic Filling: Filter integrity tester with automated data entry and storage and electronic test results
Data Criticality: Low (Verifies filter identity to support correct filter installation)
Data Vulnerability Factors:
  Human Factor: Automated data entry and data back-up mitigates human error to a residual level
  GMP Process: Filter integrity tester is manually controlled by the operator
  Data Mgmt: Latest filter integrity tester is used that has the ability to interface with bar code readers to capture set-up data, secure data storage module, and record operator's actions, including number of test runs, electronically and on the paper printout
Data Controls in Place: High
  Human Factor: Supervisory review is limited to only investigation of failed tests
  GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly
  Data Mgmt: Reliance on automation, validation, and the robustness of monitoring process
Data Vulnerability Level: Negligible

Ex 9c
Mfg. Operation: Aseptic Filling: Filter integrity tester with end-to-end automation
Data Criticality: Low (Verifies filter identity to support correct filter installation)
Data Vulnerability Factors:
  Human Factor: End-to-end automation mitigates human error to a residual level
  GMP Process: Process robustness assured by automation, validation, and monitoring practices; redundant sterilizing filters installed
  Data Mgmt: Filter is tested in-line without operator having to initiate the test manually, and test results are automatically transferred to a secure data storage module
Data Controls in Place: High
  Human Factor: Supervisory review is limited to only investigation of failed tests
  GMP Process: Test equipment and associated automation interfaces validated, and process robustness monitored regularly
  Data Mgmt: Reliance on automation, validation, and the robustness of monitoring process
Data Vulnerability Level: Negligible
Where the vulnerability is Significant or Moderate, steps should be taken to mitigate the vulnerabilities. The dispensing process, for example, could be improved by avoiding the use of spreadsheets, using a barcode/scanning/enterprise resource planning interface to confirm the correct identity of materials, or adding a real-time double-check that the correct quantity of the materials has been dispensed. Both the equipment set-up and packaging operation could be improved by increasing the access controls, so each user has a unique account, or by adding minimum–maximum product-specific limits to the tolerance ranges for sensors that cannot be modified by the operator.

For the batch reconciliation, a barcode/scanning/validated electronic system interface could replace the manual logbook for identifying when a new component is required.
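The barcode-based mitigations described above can be sketched as follows. This is a hypothetical illustration, not a prescribed implementation: the `DispensingLine` class and material codes are invented, and no specific ERP product is implied. The point is that identity and quantity are verified in real time against an electronic dispensing order rather than a spreadsheet or manual logbook.

```python
# Hypothetical sketch: barcode scans verified against an electronic
# dispensing order, replacing spreadsheet- and logbook-based checks.

class DispensingLine:
    def __init__(self, order):
        # order maps material code -> quantity still to be dispensed
        self.remaining = dict(order)
        self.log = []  # system-generated usage history, not a manual logbook

    def scan(self, barcode: str, quantity: float) -> bool:
        """Real-time double-check of material identity and quantity."""
        if barcode not in self.remaining:
            self.log.append(("REJECT-IDENTITY", barcode, quantity))
            return False
        if quantity > self.remaining[barcode]:
            self.log.append(("REJECT-QUANTITY", barcode, quantity))
            return False
        self.remaining[barcode] -= quantity
        self.log.append(("DISPENSED", barcode, quantity))
        return True
```

Because every scan, including each rejection, is appended to a system-generated log, a use history of the components is available automatically, addressing the reconciliation weakness noted for the manual logbook.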
Ex 10g
Mfg. Operation: Batch reconciliation: Batch yield is calculated, along with reconciliation of unused packaging materials
Data Criticality: High (If not performed correctly, potential to impact patient safety)
Data Vulnerability Factors:
  Human Factor: Operators can stop the batch early, or add rejected materials back into the line, impacting the reconciliation process
  GMP Process: Batch yield is calculated using counters (pass/reject information) that are part of the batch report; reconciliation of packaging components is a completely manual process
  Data Mgmt: Operators use logbooks to record the use of packaging components during the batch; possible to miscalculate the quantity of packaging materials used
Data Controls in Place: Medium
  Human Factor: Double verification by a peer for each packaging component used, and quality unit review of reconciliation process
  GMP Process: Controlled with written procedures and training; operators need to complete a logbook each time a new package component is required, making a use history of packaging components available
  Data Mgmt: Equipment automatically generates a report at the end of packaging; report template and data are locked and no one can edit the contents
Data Vulnerability Level: Moderate
8.5 References for Appendix I
1. U.S. Food and Drug Administration. Guidance for Industry: Sterile Drug Products Produced by Aseptic Processing—Current Good Manufacturing Practice; U.S. Department of Health and Human Services: Rockville, Md., 2004.
2. European Commission. Annex 1: Manufacture of Sterile Medicinal Products, EudraLex – Volume 4 – Good Manufacturing Practice for Medicinal Products for Human and Veterinary Use, Consultation Document; European Commission: Brussels, 2017.
Make a reasonable number of photocopies of a printed PDA Electronic Publication for the individual use of an Authorized User

Authorized Users are not permitted to do the following:

Allow anyone other than an Authorized User to use or access the PDA Electronic Publication
Display or otherwise make any information from the PDA Electronic Publication available to anyone other than an Authorized User
Post the PDA Electronic Publication or portions of the PDA Electronic Publication on websites, either available on the Internet or an Intranet, or in any form of online publications without a license from PDA, Inc.
Transmit electronically, via e-mail or any other file transfer protocols, any portion of the PDA Electronic Publication
Create a searchable archive of any portion of the PDA Electronic Publication
Use robots or intelligent agents to access, search and/or systematically download any portion of the PDA Electronic Publication
Sell, re-sell, rent, lease, license, sublicense, assign or otherwise transfer the use of the PDA Electronic Publication or its content
Use or copy the PDA Electronic Publication for document delivery, fee-for-service use, or bulk reproduction or distribution of materials in any form, or any substantially similar commercial purpose
Alter, modify, repackage or adapt any portion of the PDA Electronic Publication
Make any edits or derivative works with respect to any portion of the PDA Electronic Publication including any text or graphics
Delete or remove in any form or format, including on a printed article or photocopy, any copyright information
am
broad perspective reflecting best thinking and practices currently available.

The final working draft is next reviewed by the PDA Advisory Board or Boards that sanctioned the project. PDA's Advisory Boards are: Science Advisory Board, Biotechnology Advisory Board, and Regulatory Affairs and Quality Advisory Board. Following this stage of review, the PDA Board of Directors conducts the final review and determines whether to publish or not publish the document as an official PDA Technical Report.

While PDA goes to great lengths to ensure each Technical Report is of the highest quality, all readers are encouraged to contact PDA about any scientific, technical, or regulatory inaccuracies, discrepancies, or mistakes that might be found in any of the documents. Readers can email PDA at: [email protected].
Sanctioning Advisory Board: RAQAB
- Jacqueline Veivia-Panter, Hitachi Chemical (Chair)
- Jeff Broadfoot, Emergent BioSolutions Inc (Immediate Past Chair)
- Anette Yan Marcussen, NNE (Vice Chair)
- Mirko Gabriele, Thermo Fisher Scientific
- Aaron Goerke, PhD, Roche
- Tor Gråberg, AstraZeneca

PDA Officers and Directors
Officers
- Chair: Jette Christensen, Novo Nordisk
- Chair-Elect: Susan Schniepp, RCA
- Treasurer: Melissa Seymour, Biogen, Inc.
Directors
- Masahiro Akimoto, Otsuka Pharmaceutical Factory, Inc.
- Barbara Allen, Eli Lilly and Company
- Michael Blackton, Adaptimmune
Bethesda Towers
Suite 600
Bethesda, MD 20814 USA
E-mail: [email protected]
Web site: www.pda.org