
This article is licensed under CC-BY-NC-ND 4.0.

pubs.acs.org/ac | Perspective

Ongoing Analytical Procedure Performance Verification Using a Risk-Based Approach to Determine Performance Monitoring Requirements

Phil J. Borman, Amanda M. Guiraldelli,* Jane Weitzel, Sarah Thompson, Joachim Ermer, Jean-Marc Roussel, Jaime Marach, Stephanie Sproule, and Horacio N. Pappa

Cite This: Anal. Chem. 2024, 96, 966−979
Received: August 18, 2023. Revised: November 21, 2023. Accepted: November 21, 2023. Published: January 8, 2024.
© 2024 The Authors. Published by American Chemical Society. https://doi.org/10.1021/acs.analchem.3c03708
ABSTRACT: The analytical procedure life cycle (APLC) provides a holistic framework to ensure analytical procedure fitness for purpose. USP's general chapter <1220> considers the validation activities that take place across the entire analytical procedure lifecycle and provides a three-stage framework for its implementation. Performing ongoing analytical procedure performance verification (OPPV) (stage 3) ensures that the procedure remains in a state of control across its lifecycle of use post validation (qualification) and involves an ongoing program to collect and analyze data that relate to the performance of the procedure. Knowledge generated during stages 1 (procedure design) and 2 (procedure performance qualification) is used as the basis for the design of the routine monitoring plan to support performance verification (stage 3). The extent of the routine monitoring required should be defined based on risk assessment, considering the complexity of the procedure, its intended purpose, and knowledge about process/procedure variability. The analytical target profile (ATP) can be used to provide or guide the establishment of acceptance criteria used to verify the procedure performance during routine use (e.g., through a system/sample suitability test (SST) or verification criteria applicable to procedure changes or transfers). An ATP, however, is not strictly required to perform OPPV, and a procedure performance monitoring program can be implemented even if the full APLC framework has not been applied. In these situations, verification criteria can be derived from existing validation or system suitability criteria. Elements of the life cycle approach can also be applied retrospectively if deemed useful.

■ INTRODUCTION
Robust and reliable analytical procedures are required across many manufacturing industries, including the pharmaceutical, fine and specialty chemical, food, and petrochemical industries. These industries rely on fit-for-purpose analytical procedures over many years to ensure that routinely manufactured products are of high quality. Many of these industries use ISO based accreditation or certification to ensure that the reportable values are fit-for-purpose. For example, ISO/IEC 17025 explicitly includes the fit-for-purpose requirement. Analytical procedure failures can result in a delay or inability to deliver products to customers or, worse, lead to unacceptable products being released due to reportable results incorrectly appearing to be within specification. In the pharmaceutical industry, this can have severe consequences such as being unable to deliver critical medicines to patients. The ATP as described by Jackson et al. and others1−9 can be established to summarize the performance requirements associated with a measurement on a quality attribute (or multiple attributes), which need to be met by an analytical procedure. The ATP can be used to define and assess the fitness of an analytical procedure during the development phase as well as to help define the validation (or qualification) criteria of the developed procedure. Analytical procedures used to test pharmaceutical products are typically validated in accordance with the International Council for Harmonization (ICH) Q2(R1) guideline10 or USP <1225>.11 Validation, however, is often treated as a one-off event,12 with little consideration given to verifying how well the procedure will perform in everyday, "real world" operating conditions. Regulators and industry frequently use ICH Q2(R1)10 or USP <1225>11 in a "check box" manner without considering the intent of these guidance documents, or the philosophy of
analytical procedure validation.13 Performing OPPV ensures that the procedure remains in a state of control across its lifecycle of use post validation (qualification). This should involve an ongoing program to collect and analyze data related to the procedure performance. The importance of this continued verification is recognized by its inclusion in the ISO standards, such as the ISO/IEC 17025 requirements for ensuring the validity of results. The analytical procedure life cycle (APLC) provides a holistic framework to ensure analytical procedure fitness for purpose. USP's general chapter <1220>, entitled Analytical Procedure Life Cycle5 (official as of May 1, 2022 in USP-NF), describes a three-stage approach similar to FDA's lifecycle-based process validation guidance14 and highlights the interconnectivity of all stages and activities that are intended to demonstrate procedure fitness for use, allowing for continuous risk and knowledge management. The APLC framework consists of the ATP and three stages: procedure design (stage 1), procedure performance qualification (stage 2), and ongoing procedure performance verification (stage 3).5,9 The enablers of this approach, which are considered enhanced from the traditional approach, are knowledge of quality risk management (QRM) and application of sound scientific approaches. This allows for in-depth knowledge acquisition of the procedure. The analytical procedure life cycle approach is consistent with quality by design (QbD) concepts described in ICH Q guidelines (Q8−13).15−20 The new draft ICH Q1421 is the first to address QbD principles applied to procedure lifecycle management and is similar to stage 1 while covering part of stage 3 according to USP <1220>. Q2(R2) can be seen, at least in part, as similar to stage 2 described in USP <1220>. ICH Q14 states that "in the enhanced approach the control strategy is derived from systematic and risk-based evaluation of analytical procedure parameters and an understanding of potential impact on analytical procedure performance. The analytical procedure parameters that need to be controlled to ensure the performance requirements described in the ATP are met and their acceptable ranges should be described in the procedure".21 OPPV provides data and information about the procedure performance over time and involves monitoring the analytical procedure during routine use, thus providing ongoing confidence that the reportable values generated are fit-for-purpose.22 In addition, ongoing verification can be used to assess the impact of changes to analytical conditions, ensuring the modified procedure will produce reportable values that meet performance criteria that can be defined in an ATP and are fit-for-purpose.5,8,9,22

It can also serve as a preventive measure by providing an early indication of potential performance issues or adverse trends, aiding in the identification of potential changes for the analytical procedure. The overall knowledge generated during stages 1 and 2 is used as a basis for the design of the routine monitoring plan to support performance verification. The extent of the routine monitoring required should be defined based on risk assessment, considering the complexity of the procedure, its intended purpose, and knowledge about process/procedure variability.22,23 Risk assessment will allow for the identification of analytical procedures that require performance monitoring and the extent of monitoring, facilitating the selection of suitable indicators. Steps for the design of the procedure performance monitoring will usually include:
• Assessment of risk associated with the analytical procedure
• Selection of appropriate analytical procedure performance indicators and attributes (based on risks identified during stages 1 and 2)
• Establishment of a data sampling strategy and definition of assessment frequency
• Establishment of data processing and analysis methodology (e.g., data analysis and visualization such as trending charts, control charts, and other Statistical Analytical Procedure Performance Monitoring and Control (SAPPMC) techniques; data analysis methodologies using statistical tools)
• Establishment of suitable performance monitoring and controlling rules (limits/acceptance criteria)
• Definition of a reaction plan to guide actions (e.g., need for additional training, SOP review/creation, updating the existing analytical control strategy, trending/analyzing additional performance indicators not previously monitored)

The procedure performance monitoring may include the definition of the strategy to assess rates of failures and the root cause for invalid tests.

In stage 3, the ATP can be used to provide or guide the establishment of acceptance criteria used to verify the procedure performance, assess the risks, and determine the probability of success and failure. If an ATP has previously been established, then performance criteria can be verified directly (e.g., through precision and accuracy of a quality control sample with the same replication strategy as for the reportable value) or indirectly through ensuring that key analytical procedure attributes (e.g., precision levels, chromatographic resolution between peaks, tailing factor of peaks, etc.) linked to the ATP are within appropriate limits.22 OPPV can still be performed even if an ATP has not been previously defined, and elements of the life cycle approach can also be applied retrospectively if deemed useful. In these situations, verification criteria can be derived from existing validation or system suitability criteria. However, retrospective establishment of an ATP is recommended to have a clear alignment with the requirements. For low-risk procedures, verification may simply involve monitoring of atypical results and/or system suitability failures.6

This paper focuses on describing the importance of the OPPV in the APLC framework to ensure the procedure is fit for use. The following sections cover strategies that support the design of the performance monitoring plan along with the different elements that should be contained in the monitoring program.

■ IDENTIFICATION OF ANALYTICAL PROCEDURES WHICH REQUIRE PERFORMANCE MONITORING THROUGH RISK ASSESSMENT

The performance of all analytical procedures used in the product control strategy (quality control testing) should be reviewed across the lifecycle of the product to ensure they remain fit for their intended purpose.5,10,23 This review should include routine monitoring (for prioritized procedures) as well as the evaluation of changes made to procedures.23 The extent of the routine monitoring required can be defined using a risk-based approach. When designing a risk assessment for an analytical procedure, it is important to consider which measures of analytical procedure performance or risk are readily available as inputs to the assessment. Two risk assessment approaches are described below.

Simple Risk Assessment Approach. A simple risk assessment approach may focus only on the type and complexity of the analytical procedure and its intended use, enabling the establishment of a platform for risk assessment according to the analytical procedure (technique) type. Analytical procedures that do not form a critical part of the product control strategy

can be treated as low risk by default. Ermer et al.23 proposed the following strategy for risk categorization:
• Low risk: Qualitative and semiquantitative tests (e.g., visual tests, simple standard tests, limit tests)
• Medium risk: Medium risk quantitative tests, less complex procedures (e.g., compendial standard tests, water determination, residual solvents, simple assay techniques (UV or chemical titration), kinetics of viral inactivation on cells, total protein content)
• High risk: High risk quantitative tests, assays, and more complex procedures (e.g., assay and related substances by liquid chromatography, bioassays, in vivo potency assays)

Data-Driven Risk Assessment Approach. This may include an assessment of the current performance of the manufacturing process (which includes variability from the product and the analytical procedure) through an assessment of process performance (Ppk) if enough representative batches of the manufacturing process (and analytical procedure) are available. A more involved risk assessment approach may include an assessment of the precision of the specific analytical procedure relative to the product specification or ATP. Precision-to-tolerance ratio (P/TOL)24 and Z-score25 are examples of analytical procedure capability metrics that can be calculated to aid this assessment. Conformity and validity rates and/or procedure robustness and reproducibility10,26 can also be taken into account. Where procedures have been shown to be sensitive to changes (e.g., analyst, equipment, environment, reagent lots, or small changes in procedure parameters), this presents a higher risk. Figure 1 depicts sources of variability associated with the process performance index (Ppk) and procedure capability metrics (such as P/TOL and Z-score).

Figure 1. Different precision levels and a representation of the total procedure and process variability.

Assessment of Manufacturing Process Performance through Ppk Assessment. The process performance index "Ppk" (eq 1) can be used as a worst-case surrogate measure for procedure precision. It represents the combination of manufacturing process variability and analytical procedure variability.

Ppk = min[(USL − μ)/(3σoverall), (μ − LSL)/(3σoverall)]   (1)

where USL and LSL are the upper and lower specification limits, μ is the mean, and σoverall is the overall standard deviation. If a drug product has good overall process performance for a particular critical quality attribute (CQA), i.e., the test results show little variation versus the specification, then it can be assumed that the analytical procedure precision is good.

Assessment of Analytical Procedure Performance through PTOL or Z-Score. The calculation of PTOL or Z-score metrics (described below) involves the use of the analytical procedure standard deviation and deliberately excludes the manufacturing process variability. It is important to consider this context when defining thresholds for high, medium, and low risk procedures, as per Table 1. If it is preferred to incorporate an estimate of manufacturing process variability into the calculations, then the process mean can be substituted for maxima or minima of the process range, either based on an adequate data set or on estimation (e.g., the process range is assumed to cover half of the specification range).

Table 1. Data-Driven Risk Assessment for the Identification of High-Risk Analytical Procedures

Precision to tolerance ratio (PTOL) is described by Chatfield and Borman24 (eq 2).

PTOL1-sided = max of 3σa/(USL − process mean) or 3σa/(process mean − LSL)   (2)

where σa is the analytical procedure standard deviation, USL is the upper specification limit, and LSL is the lower specification limit. Low PTOL values are desirable.24
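To show how these capability metrics can be evaluated in practice, the following minimal Python sketch (hypothetical batch results, specification limits, and analytical standard deviation) computes Ppk per eq 1 and the one-sided PTOL per eq 2; the Z-score of eq 3, introduced in the next section and built from the same inputs, is included for completeness.

```python
# Minimal sketch: data-driven risk metrics from hypothetical batch assay results (% label claim).
# Ppk (eq 1) uses the overall (process + analytical) variability; PTOL (eq 2) and the Z-score
# (eq 3, next section) use only the analytical procedure standard deviation, sigma_a.
from statistics import mean, stdev

batch_results = [99.1, 100.4, 99.8, 100.9, 99.5, 100.2, 99.9, 100.6, 99.3, 100.1]
usl, lsl = 105.0, 95.0        # specification limits
sigma_a = 0.45                # analytical procedure SD, e.g., from intermediate precision studies

mu = mean(batch_results)
sigma_overall = stdev(batch_results)

# eq 1: process performance index (worst-case surrogate for procedure precision)
ppk = min((usl - mu) / (3 * sigma_overall), (mu - lsl) / (3 * sigma_overall))

# eq 2: one-sided precision-to-tolerance ratio (low values are desirable)
ptol = max(3 * sigma_a / (usl - mu), 3 * sigma_a / (mu - lsl))

# eq 3: Z-score to the nearest specification limit (high values are desirable)
z = min(abs(usl - mu), abs(mu - lsl)) / sigma_a

print(f"Ppk = {ppk:.2f}, PTOL (1-sided) = {ptol:.2f}, Z = {z:.1f}")
```

Thresholds applied to such metrics (e.g., as in Table 1) would then place the procedure in a low-, medium-, or high-risk category.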

The Z-score25 (eq 3) is defined as the number of procedure standard deviations between the mean batch result and the nearest specification limit.

Z = |SL − process mean| / σa   (3)

where SL is the nearest specification limit and σa is the standard deviation of the analytical procedure. The analytical procedure standard deviation (σa) can be derived from a combination of precision studies such as repeatability, intermediate precision, ruggedness studies,23 and data generated from routine use of the procedure. High absolute Z-score values are desirable.

A data-driven risk assessment that takes into account process and analytical procedure capability is shown in Table 1. When performing a risk assessment, analysts should be aware that it is possible that a low-risk procedure could become a high-risk procedure during a subsequent periodic risk assessment, due to changes in the manufacturing process which impact mean batch results, or changes to specification limits. There may not have been a change in the procedure precision (σa); however, the level of risk associated with the procedure is rated highly due to the proximity of the batch data to the specification limit(s), and therefore consideration of procedure performance monitoring is recommended to ensure that a drift in procedure performance does not further increase the risk of out of specification batch results. In such cases, it may not be possible or practical to improve procedure performance in order to reduce the risk of out of specification batch results. It may be more appropriate to consider manufacturing process improvements or review of the specification.

■ RISK-BASED PERFORMANCE MONITORING PLANS FOR ANALYTICAL PROCEDURES

Analytical Procedure Performance Indicators. Different analytical procedure performance indicators can be monitored and used to obtain information about the analytical performance on a continuous basis. Ermer et al.23 proposed the classification of performance indicators into the following classes: (1) conformity indicators, (2) validity indicators, and (3) analytical procedure performance attributes and parameters.

Conformity Indicators. Conformity indicators are the number of out-of-specification (OOS) results with an analytical procedure root cause in a given time period. The number and types of analytical procedure errors will provide information about the reliability of the analytical procedure. In the case of comparison to established acceptance limits, the violation of such limits indicates performance problems. Reportable values outside of the specification limits may also be caused by manufacturing root causes, but analytical root causes (laboratory error) may reveal the poor performance of the analytical procedure. The OOS procedure is a basic GMP requirement and must be generally established. Thus, this information is readily available and may be classified as "conformity".23

Validity Indicators. Validity indicators are usually the number of invalid test results due to failures of system or sample suitability criteria (usually established during stages 1 and 2 of the lifecycle approach based on risk assessment) in a given time period or termination of the analysis due to other reasons (e.g., not yielding the reportable value). Violation of analytical procedure acceptance criteria established for validity indicators (e.g., system suitability test (SST) limits or sample suitability assessment (SSA)21) is also straightforward information and will result in an invalidation of the whole analysis and may be classified as "validity".23 The conformity and validity categories are of a qualitative nature, and graphical presentation or control charts are not necessarily needed for monitoring performance. Both categories may be evaluated as a total number during the evaluation period or as a rate with respect to the total number of analyses, depending on the latter.

Analytical Procedure Performance Attributes or Parameters. These measurements, collected during testing, could be critical analytical procedure attributes or performance parameters specified in the analytical procedure with well-defined acceptance criteria or ranges, respectively, or procedure attributes and performance parameters selected due to a potential link with procedure performance. Such numerical data lend themselves well to routine monitoring by means of control charts. However, if the attributes are primarily used for adjustment purposes, then they are not well suited for the monitoring program of the analytical procedure. Whenever replication is performed, the precision at the respective level can be monitored. Examples include the following:
• The range (i.e., the difference between minimum and maximum result) or RSD of replicate standard or sample injections of the same preparation solution will provide information on the system variability.
• The range or RSD of independent replicates of standard or sample preparations.
• Replicate series or runs (common for bioassays) will provide information on the (intermediate) factors varied between the runs (usually calibration as a major contribution).

Analyzing a material which is representative of the sample in question (control sample, QC check) will address not only precision but also (long-term) accuracy, as a shift from the target value (average) can be evaluated (bias). If replication is performed, all precision levels can be covered. Where the same replication strategy is used27,28 as for the generation of the reportable value,23 the precision of the reportable value can be monitored and compared directly to the ATP requirement. Other examples are given in Table 6. Numerical analytical procedure attributes may also include resolution, peak symmetry, signal-to-noise ratio, root-mean-square error of regression, coefficient of correlation/determination of regression, etc. As indicated by the definition, SST related attributes are often dominated by the equipment used, and the data may vary between instruments. Consequently, monitoring and evaluation could become difficult depending on the number of instruments used. In such cases, monitoring can be performed for individual or selected instruments (as far as the number of data is sufficient, e.g., more than 20 results per year). Monitoring of such SST attributes will only provide useful information in case of a substantial contribution to the overall performance. However, monitoring of equipment performance is a benefit on its own and may be established as an ongoing instrument performance qualification program.3 Trending of such instrument/procedure attributes may also facilitate detection of adverse trends or an appropriate time point to take an action to avoid procedure performance issues (e.g., change of chromatographic columns, instrument wearable parts, etc.).

Identification of Procedure Performance Indicators Based on Risk Assessment. The extent of routine monitoring required can be defined using a risk-based approach. For low-risk analytical procedures, monitoring of conformity indicators is usually sufficient, whereas for high-risk procedures additional

monitoring of validity indicators and monitoring of informative analytical procedure attributes and/or performance parameters can provide early warning of potential changes in procedure performance. Table 2 describes applicable performance indicators for the procedure risk categories.

Table 2. Performance Indicators for the Different Procedure Risk Categories (Adapted from Reference 23)
performance indicator | applicable for procedure risk categories
conformity | low-, middle-, and high-risk procedures
validity | middle-risk(a) and high-risk procedures
analytical procedure performance attributes/parameters | middle-risk(a) and high-risk procedures
(a) If considered a priority procedure based on the wider product/use context.

Once any high-risk procedures have been identified, a useful way to identify the most value adding procedure attributes and/or performance parameters to monitor is to perform a procedure performance cause and effect review (PPC&ER). This detailed review helps to develop an analytical team understanding of the likely root causes of any high risks identified from the risk assessment. With an understanding of the likely root causes, potential improvement ideas and tailored procedure performance indicators can be documented for further consideration (and potentially taken forward for inclusion in the monitoring plan).

The format of PPC&ER meeting(s) and any reports generated will depend on the nature of the risks flagged in the risk assessment. Both USP <1220> and ICH Q14 describe the use of Ishikawa or fishbone diagrams for risk identification. These can be useful tools for collective brainstorming of potential sources of variability associated with a procedure. Gemba29 is very important for this activity, and the analysts that are most familiar with the procedures should be included in the discussion. It can also be useful to include experienced analytical scientists who are not familiar with the particular procedure under assessment and/or technique subject matter experts. Augmented reality technology can be a great way to orient all meeting participants with the procedure under assessment and to demonstrate any particularly problematic steps. It is not practical to monitor all potential sources of variability (e.g., there may be digital challenges), so it is important to prioritize the most value adding procedure performance indicators for the monitoring plan. Heat maps or failure mode effects and criticality analysis (FMECA)30 are great tools for prioritization of the sources of variability identified in the brainstorm. Once sufficient analytical procedure platform knowledge has been acquired, creation of generic Ishikawa diagrams, process maps, or FMECA can prove useful starting points and serve to accelerate this step. A PPC&ER can be performed for new high-risk procedures in stage 1 (as part of procedure development, in support of procedure ruggedness and robustness studies and/or design of the analytical procedure control strategy). If this is the case, then it is prudent to develop the first version of the monitoring plan at the time of analytical procedure control strategy development, as the analytical procedure control strategy and monitoring plan should be complementary. For established (e.g., commercial) analytical procedures where stage 1 was not followed or documented, a new PPC&ER can be a good first step toward the design of a monitoring plan. Two examples are provided below.

Example 1. The PPC&ER highlighted that a chromatographic procedure for the determination of impurities has challenges with sensitivity. In this case, a good procedure performance indicator to prioritize for monitoring in a control chart would be the signal-to-noise of a sensitivity standard analyzed on each test occasion. If the control chart can be


filtered by instrument ID, then this can provide even further insights.

Example 2. If a bioassay or similarly complex assay procedure is known to have complicated and variable sample preparation steps, then a good procedure performance indicator to prioritize for monitoring in a control chart could be a quality control sample or reference sample analyzed on each test occasion. If the control chart can be filtered by analyst ID, then this can provide great insight into potential further training requirements.

Sources of Performance Data and Information. The general flowchart of an analytical procedure (Figure 2) reveals starting points to retrieve data and information about its performance.

Figure 2. General flowchart of an analytical procedure, potential sources of data, and performance indicators to be considered in the routine performance monitoring program.

The selection of the type of data and information depends on the given analytical technique and the analytical procedure design. For example, in the case of titrations, no replicate analysis of the same test solution is possible, or relative standard deviations (RSD) are only appropriate to be used (individually) in case of sufficient repetitions (e.g., at least six). Test results combine the unit operation steps after the validity assessment for each sample preparation/measurement. Analyzing a sample from a manufacturing process will always include both analytical and manufacturing variability.

For the purpose of analytical performance monitoring, the variability of the manufacturing process should be excluded as much as possible. Examples of excluding manufacturing variability are the use of replicate results in batch analysis or analysis of control batches (control samples). Only the results for the control sample (acquired using the defined replication strategy for the test sample) can be compared directly to the ATP. Control samples are materials with a (well-known) matrix composition close to that of the test sample (identical to or adequately representative of the test samples, e.g., a homogeneous batch of product tested alongside each routine sample), analyzed along with test samples in order to evaluate the accuracy of the analytical procedure, helping to ensure that the analyses are properly performed so that the results obtained with test samples are reliable. They should be stable over time and be available in sufficient amounts. If the manufacturing variability is negligible with respect to the analytical variability, batch samples may also be used as "virtual" control samples (e.g., assay of many small molecule APIs, and for some well-established and controlled drug product manufacturing processes). All other performance attributes (including a single determination of a control sample) address only some of the unit operation steps and thus allow only an indirect comparison to the ATP by alignment of the performance levels.

Data Sampling and Establishment of Assessment Frequency. As part of the pharmaceutical quality system (PQS), OOS results with a laboratory root cause (conformity indicators) are usually tracked. If the monitoring program is established as a regular part of the PQS, a monitoring plan is required for each analytical procedure to be considered. The plan should include (at least) the following:
• all defined performance indicators, according to risk
• the source of the data and monitoring tools (e.g., control and monitoring charts, etc.)
• the data sampling rate (e.g., for each release testing, for each stability testing, for each series/run, for each instrument separately, or defined only once a week, etc.)
• the monitoring frequency (review and evaluation of all indicators or control charts within a given period of time, e.g., each month or quarter, depending on the volume of testing)
• monitoring rules or control limits/out-of-control rules
• calculation of long-term parameters and periodic assessment frequency

Different strategies can be used for data collection. These strategies usually include (a) information and parameters recorded in other electronic systems or entered manually in paper format, which would require a verification of the transfer from raw data or instrument data systems, and (b) use of a laboratory information management system (LIMS), facilitating data sampling and trending and mitigating data integrity issues. In the latter case, the information on validity indicators and the data for the analytical procedure performance attributes and parameters are available within a LIMS, which is able to create summary reports for each category and to generate graphical presentations or control charts, as this would avoid the transfer of data. In the case of validity information, where a graphical presentation or control charts are not essential, a paper-based recording might be a simple and efficient solution, depending on the frequency of violations. In the case of additional electronic systems, the question of data integrity should be considered. In this case, a reasonable compromise should be achieved with respect to the validation and review details, as the aim is not GMP batch release or decision.

The frequency of the performance assessment of the analytical procedure should be defined. It can be defined depending on the amount of data available, in order to have a sufficient number of data available for a reliable assessment. Typically, the maximum frequency of any monitoring assessment will be annual, to align with the GMP requirements for an annual product review, which should include or refer to the analytical procedure assessment. In case of a higher testing frequency, the assessment can be made quarterly or biannually, but there should ideally be 20−30 batches available.26,30,31 Refer to the section Performance Assessment and Continuous Improvement of Analytical Procedures for additional discussion on performance assessment/evaluation and plan of actions.

Continuous Verification of the ATP Requirements. The ATP requirements relate to the reportable value and can therefore usually not be verified directly, which is a challenge for ATP criteria verification during routine analysis. For quantitative procedures, measurement uncertainty (precision and bias) can be considered as the primary performance characteristic.5,32 A maximum allowable measurement uncertainty can be defined in the ATP as the performance requirement for a given measurement, and continuous verification of the ATP requirements may help to ensure fitness for use along the entire procedure life cycle. Usually, during procedure design, control strategies are defined to eliminate or minimize systematic errors, or bias.32 Precision (random variability) should be assessed in procedure design and qualification,32 allowing the selection of conditions with reduced variability and determination of the suitability of the analytical procedure. Each step of an analytical procedure contributes its variability to the overall precision. Precision levels can be used to describe groups of procedure sources of variation: system precision, repeatability, and intermediate precision/reproducibility32 (Figure 1).

Precision. In general, analytical performance attributes extracted from routine analysis may be used to express the precision of the reportable value, which may be directly compared with the ATP precision requirement (which includes all precision levels). However, this is only valid if a control sample is included in the monitoring program and analyzed by the same replication strategy as the test sample, allowing the precision of the reportable value to be directly obtained as a

long-term average parameter and compared to the ATP precision requirement. Usually, analytical performance attributes address mainly the variability of various steps of the analytical procedure, i.e., the precision levels and not the overall procedure precision, imposing a challenge to compare these analytical performance attributes directly with ATP precision criteria (e.g., RSD or ranges of injection/measurement of sample or reference standard, sample preparation, etc.). If analytical performance attributes expressing only precision levels lower than that of the reportable value (overall procedure variability) are available, a simple way to assess the ATP requirement for the reportable value is obtaining the average of precision levels (e.g., system precision, repeatability, or intermediate precision/reproducibility). It is worth mentioning that, in the case of a reportable value for a single sample preparation, the intermediate precision/reproducibility corresponds to the precision of the reportable value.28,32 An acceptable repeatability or intermediate precision/reproducibility may even be larger than the ATP requirement, as in the case of averaging the different precision levels (variances), variability is reduced. If there are no shifts and trends in the precision parameters as well as in the average parameters, the maintenance of the validated status is verified. If the manufacturing variability is negligible with respect to the analytical variability (e.g., less than 1/4), the precision of the reportable value can be directly calculated from batch results ("virtual" control sample). This may be the case for the assay of many small molecule APIs, and for some well-established and controlled drug product manufacturing processes. Reliable precision can also be obtained from other sources,28 for example, by extraction from routine analysis such as duplicate injections or sample preparations,33 system suitability tests, or stability studies.34

Accuracy (Bias). Accuracy can be addressed directly only if the control sample has been established by independent characterization. However, as the control sample results will reveal time-dependent shifts and trends, the relative accuracy can be controlled; in the case of no shifts and trends, it may be assumed that the accuracy demonstrated in validation has been maintained.

Legacy Procedures without ATP. In principle, the ATP should be continually verified along the entire procedure lifecycle. This verification is challenging for legacy procedures without a defined ATP containing previously established performance requirements. For these procedures, the ATP requirement may be established retrospectively based on the existing data available. Approaches for setting limits and procedure performance evaluation may include the estimation of tolerance intervals (TI) using the validation data35,36 and/or using current control data (for procedures with a long period of use and associated specifications) and comparing it to a defined range calculated using the measurement uncertainty (MU) from historical control data.23 The ATP criteria are said to be met if the width of the TI is less than the range of the MU.

■ STATISTICAL ANALYTICAL PROCEDURE PERFORMANCE MONITORING AND CONTROL (SAPPMC)

Trend plots of critical procedure performance indicators (e.g., RSD from system precision checks, results from routine testing, control or stability samples, or out-of-trend investigations) can be established. A state of statistical control is achieved when only random causes of variation are present,29 and departures from random variability should trigger a reaction plan and steps taken to bring the procedure back into a state of control. SAPPMC (a term coined in this article, analogous to statistical process control (SPC) for product and process) is a tool for achieving this objective and can help to quickly detect shifts so that corrective and preventative actions can be actioned. Control charting is the simplest and most common technique used to implement SAPPMC.

Control Charting. Control charts are powerful tools for monitoring an analytical procedure and generally consist of an upper control limit (UCL), a center line, and a lower control limit (LCL).37−39 These are defined based on data collected from the process while it is in a state of control and in alignment with the ATP. They are particularly powerful for signaling when something has changed or gone wrong or when the procedure is doing something different. To gain the most benefit, it is critical to plot data in real time and to make decisions immediately with regard to the next steps. When the process is in a state of statistical control, almost all of the data points will be within the lower and upper control limits, varying randomly around the center line. When a point falls outside of the lower or upper control limits, it can be assumed that the process is out of statistical control, and this may trigger an action. Actions may be minor or major and will depend on the severity and impact of the violation. We may also add additional rules that can trigger an action before a value falling outside the control limits is observed. These rules allow the early identification of systematic and nonrandom trends that suggest the process may be shifting, even if there are no observed values outside of the limits yet. Nonrandom patterns will almost always have a root cause that may be identified with careful investigation, and corrective actions can be implemented to bring the process back into a state of statistical control. To understand the implementation of control charts, one must have a fundamental understanding of hypothesis testing and sampling distributions. Every point that is plotted on a control chart can be thought of as a hypothesis test being performed.40 If the point falls within the control limits, we fail to reject the null hypothesis that the process is in control. If the point falls outside the control limits, we reject the null hypothesis and conclude that the process is out of control, triggering our action plan. The control limits are defined based on the alpha level of our hypothesis test, thus controlling for the risk of making an incorrect decision, such as triggering an investigation when the process is actually in a state of control. The risk of stopping the analytical process for an unnecessary investigation (type I error) must be weighed against the risk of not detecting a shift in the analytical process (type II error). This risk tolerance will dictate how the control limits are defined. There are several types of control charts that are commonly used, the most common ones being listed in Table 4. The choice of which control chart to use depends on the type of data and the design of the analytical procedure (replication). The reason for selecting the type of control chart and the data used to establish the control chart should be documented.

Shewhart Control Charts for Continuous Variables. If the quality characteristic is measured on a numeric scale, the most used control charts are the X-bar (X̅, mean) and range (R) or the X-bar and standard deviation (S). Here, both the mean value and variability of the characteristic are monitored. This approach for monitoring numeric data is appropriate for normally distributed data. This approach is also valid for non-normally distributed data due to the central limit theorem, which states that the distribution of sample means approximates a

normal distribution as the sample size gets larger, regardless of the population's distribution.41 If the data are not normally distributed and small sample sizes are used, then other approaches for control charting should be explored. It should also be considered that analytical procedures usually are composed of many steps, thus resulting in an "analytical" averaging.

The X-bar (X̅) chart limits are defined as follows (eq 4, eq 5, and eq 6):

Center Line (CL): CL = μ   (4)
Upper Control Limit (UCL): UCL = μ + Zα/2·σ/√n   (5)
Lower Control Limit (LCL): LCL = μ − Zα/2·σ/√n   (6)

where μ is the mean, σ is the standard deviation, n is the sample size, and Zα/2 is the upper α/2 percentage point of the standard normal distribution. If 3-sigma limits are used, Zα/2 is simply replaced by 3. If a value falls outside of the UCL or LCL, then we would become suspicious that the mean is no longer equal to μ, since we would expect the values to fall within the control limits 99.73% of the time when using 3-sigma limits. Since μ and σ are unknown, they are estimated from preliminary samples taken while the process is in a state of statistical control, for example, at the completion of the analytical procedure validation. Assuming there are m samples available, each with n observations of the quality characteristic, let x̅1, x̅2, ..., x̅m be the average of each sample. Thus, the estimate of μ is the mean of the means (eq 7):

X̿ = (x̅1 + x̅2 + ... + x̅m)/m   (7)

We must also estimate σ to construct the upper and lower control limits, using either the standard deviations or the ranges of the m samples with n observations. When using ranges, the range for each of the m samples and the mean range are calculated as eq 8 and eq 9, respectively:

Ri = max{x1, ..., xn} − min{x1, ..., xn}   (8)
R̅ = (R1 + R2 + ... + Rm)/m   (9)

Using the above estimates, the control limits for the X-bar, R, and S charts, and for individual values charts, are given in Table 3.

Table 3. Control Limits for Mean, Range, Standard Deviation, and Individual Control Charts, According to ISO 7870-2:202340 (Adapted with permission from reference 40, ISO 7870-2:2023. Copyright 2023 ISO)

Control charts for mean and range or standard deviation
statistic | center line (estimated) | UCL and LCL (estimated) | center line (prespecified)(a) | UCL and LCL (prespecified)(a)
X̅ | X̿ | X̿ ± A2R̅ and X̿ ± A3S̅ (b) | μ0 | μ0 ± Aσ0
R | R̅ | D4R̅, D3R̅ | d2σ0 | D2σ0, D1σ0
S | S̅ | B4S̅, B3S̅ | c4σ0 | B6σ0, B5σ0

Control charts for individuals
statistic | center line (estimated) | UCL and LCL (estimated) | center line (prespecified)(a) | UCL and LCL (prespecified)(a)
individual, X | X̅ | X̅ ± 2.660R̅m | μ0 | μ0 ± 3σ0
moving range, Rm(c) | R̅m | 3.267R̅m, 0 | 1.128σ0 | 3.686σ0, 0

(a) μ0 and σ0 are given values or parameters. (b) Only applicable when the interseries variability is negligible, i.e., repeatability equals intermediate precision. (c) Rm, the moving range, is obtained as the difference between two consecutive individual values.

The different constants (e.g., A2, D3, D4, etc.) are available for various sample sizes and are provided in the Supporting Information.

When n is small, use of the range is similar to the use of the standard deviation; however, as n increases, the range is no longer appropriate and the standard deviation should be used instead for monitoring the variability of a quality characteristic. Due to the simplicity of calculating the range and considering that samples are often small in practice, the X-bar and R plots are commonly used; however, when computer software is available, the use of the standard deviation is recommended.

According to the ISO 7870-2 standard,40 control charts with estimated limits are used to estimate whether the mean, range, or standard deviation values differ from their total mean "by an amount greater than that which can be attributed to chance causes only", while control charts with prespecified limits add the requirement regarding the center and dispersion of measurements. The prespecified values may be based on experience obtained by using control charts with no prior information or on information taken from previous data from procedure development or validation.

One should be aware that the X-bar control chart estimated limits are based on the range or standard deviation obtained from the series of measurements and that the interseries impact on the mean value of each of the different series is not considered. When interseries effects are not negligible, the estimated limits might not reflect the real variation of the mean values, and prespecified limits or the 3-standard deviation (of the means) limits might be preferred. In this case, μ0 and σ0 can be taken as the mean and standard deviation of mean values from previous data from a control chart with estimated limits or from a series of measurements obtained in intermediate precision conditions.

Cumulative-sum (cusum) Control Charts for Monitoring the Mean. The cusum chart is an alternative to a Shewhart control chart and is particularly useful for detecting small shifts; it may also be used when the sample size is n = 1. Since the cumulative sums include information from the current sample as well as the prior samples, these charts are more sensitive than Shewhart charts for detecting small shifts. However, it should be noted that cusum control charts are not as efficient at detecting large shifts as the Shewhart control charts. Cusum charts are created by plotting

Ci = Σj=1..i (x̅j − μ0)

for sample i, where x̅j is the average of the jth sample and μ0 is the target for the mean. When the analytical procedure is in a state of statistical control, the cumulative sums will vary randomly around a value of 0. If the quality characteristic mean shifts to a value > μ0, the plot will show an upward drift, and conversely a downward drift if the mean shifts to a value < μ0.

Exponentially Weighted Moving-Average Control Charts. The exponentially weighted moving-average (EWMA) control chart is similar to the cusum chart in that it is sensitive to small shifts and may be used when the sample size is n = 1. The EWMA control chart is generally easier to set up than the cusum chart.
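To make the limit calculations of eqs 7−9 and Table 3 concrete, the following minimal Python sketch (hypothetical subgroup data; A2, D3, and D4 are the standard tabulated constants for a subgroup size of n = 5) estimates X̿ and R̅ from m preliminary subgroups and derives the estimated X-bar and R chart limits.

```python
# Minimal sketch: X-bar and R chart limits from m preliminary subgroups (eqs 7-9, Table 3).
# Hypothetical data; A2, D3, D4 are the standard tabulated constants for subgroup size n = 5.

subgroups = [  # m = 5 preliminary subgroups of n = 5 results each (e.g., % assay of a control sample)
    [99.8, 100.1, 99.9, 100.2, 100.0],
    [100.3, 99.7, 100.0, 99.9, 100.1],
    [99.9, 100.0, 100.2, 99.8, 100.1],
    [100.0, 99.9, 100.1, 100.3, 99.8],
    [99.7, 100.2, 100.0, 99.9, 100.1],
]

A2, D3, D4 = 0.577, 0.0, 2.114  # constants for n = 5

x_bars = [sum(s) / len(s) for s in subgroups]   # subgroup means
ranges = [max(s) - min(s) for s in subgroups]   # subgroup ranges (eq 8)

x_double_bar = sum(x_bars) / len(x_bars)        # grand mean, eq 7
r_bar = sum(ranges) / len(ranges)               # mean range, eq 9

# Estimated control limits (Table 3)
xbar_ucl = x_double_bar + A2 * r_bar
xbar_lcl = x_double_bar - A2 * r_bar
r_ucl = D4 * r_bar
r_lcl = D3 * r_bar

print(f"X-bar chart: CL={x_double_bar:.3f}, UCL={xbar_ucl:.3f}, LCL={xbar_lcl:.3f}")
print(f"R chart:     CL={r_bar:.3f}, UCL={r_ucl:.3f}, LCL={r_lcl:.3f}")
```

New subgroup means and ranges collected during routine use would then be plotted against these limits; points outside them, or violations of the trending rules discussed below, would feed the reaction plan.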

Table 4. Control Chart Comparisons

chart | type | subgroup size (n) | advantages | disadvantages
X-bar | average | ≥2 | can easily detect large shifts in the mean; popular, lots of knowledge on its use, including rules to allow early detection of smaller shifts and/or trends | not very sensitive to small shifts in the mean without additional rules
R | variability | ≥2 | can easily detect large shifts; easy to compute | ignores a lot of information about the variability when the sample size is large; not recommended for subgroups > 10
S | variability | ≥2 | can easily detect large shifts; uses all information about the variability; optimal estimate of variability for larger subgroup sizes | more difficult to compute and interpret than the range
individuals (X) | NA | 1 | can detect large shifts | slow to detect drifts in the mean, not ideal for detecting small shifts; requires normality assumption
Cusum | average | ≥1 | can detect small shifts sooner | complicated; not as good as X-bar chart for detecting large shifts
EWMA | average | ≥1 | can detect small shifts sooner | complicated; not as good as X-bar chart for detecting large shifts; requires the setting of subjective parameters
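As a companion to Table 4, the short Python sketch below (hypothetical data and target) computes the two small-shift statistics the table compares: the cusum Ci from the previous subsection and the EWMA zi with its time-varying limits in the form given in the subsection that follows (here with λ = 0.2 and L = 3).

```python
# Minimal sketch: cusum and EWMA statistics for a series of results (hypothetical data).
# Cusum: C_i = sum_{j<=i} (x_j - mu0); EWMA: z_i = lam*x_i + (1 - lam)*z_{i-1},
# with limits mu0 +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)**(2i))) as in eqs 10-12 below.
from math import sqrt

x = [100.0, 100.1, 99.9, 100.2, 100.3, 100.2, 100.4, 100.3]  # e.g., control sample results
mu0, sigma = 100.0, 0.2      # target mean and standard deviation (assumed known here)
lam, L = 0.2, 3.0            # EWMA weight and control limit width

cusum, ewma, limits = [], [], []
c, z = 0.0, mu0
for i, xi in enumerate(x, start=1):
    c += xi - mu0                         # cumulative sum of deviations from target
    z = lam * xi + (1.0 - lam) * z        # EWMA recursion
    half_width = L * sigma * sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * i)))
    cusum.append(c)
    ewma.append(z)
    limits.append((mu0 - half_width, mu0 + half_width))

for i, (zi, (lcl, ucl)) in enumerate(zip(ewma, limits), start=1):
    flag = "OUT" if not (lcl <= zi <= ucl) else "ok"
    print(f"i={i}: cusum={cusum[i-1]:+.2f}, EWMA={zi:.3f}, limits=({lcl:.3f}, {ucl:.3f}) {flag}")
```

With the gradually drifting values used here, the EWMA statistic crosses its upper limit before any single point would breach a Shewhart 3-sigma limit, illustrating the small-shift sensitivity summarized in Table 4.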

The EWMA is defined as zi = λxi + (1 − λ)zi−1, where λ is a constant between 0 and 1 and the starting value is the process target (i.e., z0 = μ0). Often the average of preliminary data is used as the starting value of the EWMA. If the observations xi are independent random variables with variance σ², the variance of the EWMA is

σ²zi = σ²·(λ/(2 − λ))·[1 − (1 − λ)^(2i)]

Thus, the EWMA control chart is as follows (eq 10, eq 11, and eq 12):

UCL = μ0 + Lσ√{(λ/(2 − λ))[1 − (1 − λ)^(2i)]}   (10)
CL = μ0   (11)
LCL = μ0 − Lσ√{(λ/(2 − λ))[1 − (1 − λ)^(2i)]}   (12)

where the factor L is the width of the control limits. The choice of L and λ can be made to closely approximate the performance of a cusum chart for detecting small shifts. Montgomery39 found that values of λ in the interval 0.05 to 0.25 work well in practice, with λ = 0.05, λ = 0.10, and λ = 0.20 being popular choices. Smaller values of λ detect smaller shifts. In general, L = 3 represents the usual 3-sigma limits and works well. To match the approach of a Shewhart chart using the Western Electric rules, Stuart Hunter42 recommends a value of λ = 0.4.

In general, the EWMA is sensitive to small shifts but, similarly to the cusum, is not as sensitive as the Shewhart control chart for larger shifts. One solution to address these limitations is to use both a Shewhart chart and a cusum or EWMA chart to ensure that both small and large shifts are detected quickly.

When confronted with non-normal data such as from enumeration plates, the charts discussed above that assume normality (particularly the individuals charts) will give false signals and lead to unnecessary investigations. In these cases, the above control charts may still be used; however, the data must first be normalized, for example, using a Box-Cox transformation for non-negative skewed data.

In addition to the control charts described above, there are charts for other data types, such as attributes data (conforming vs nonconforming) and count data, that are less common in the context of analytical method process control. These are outside the scope of this review; however, for a comprehensive review of these charts as well as those described above, refer to ref 39. Table 4 provides a comparison between the most common control charts.

Establishment of Trending Rules. The main goal of control charts is to check that the process under surveillance is stable; this means that the statistics plotted on the graph, be it the standard deviation, S, or the range, R, for dispersion statistics, or the individual values, X, or the mean value, X-bar, for position statistics, would stay in a defined zone. Several rules have been proposed to interpret the control chart information; they are based on limits defined by the process standard deviation or a target standard deviation. Three zones are defined, being plus or minus 1, 2, or 3 times the standard deviation, or being based on the calculated control limits. There are two well-known sets of rules: the Westgard rules,43 commonly used with the individuals control charts (X charts),44 and the Western Electric rules, also known as AT&T rules, commonly used with the Shewhart control charts (X-bar, S, or R control charts).40 The latter rules are described in detail in the ISO 7870-2 standard.40 In the case of SAPPMC, it may be assumed that several measurements are available at each step; therefore the use of rules based on mean and dispersion values, such as the Western Electric - ISO 7870-240 rules, could be preferred. Table 5 displays examples of these rules for control limits according to ±3s. Note that an essential prerequisite for these rules is that the results are independent. For example, analysis of several batches in the same (calibration) run would violate this assumption. Regarding the range or standard deviation control charts, it should be noticed that the distributions of the range and standard deviation are asymmetric around their mean. Nevertheless, for ease of construction, symmetric limits can be used and a lower control limit of 0 established when the calculated lower limit is a negative value.

In SAPPMC, for a better interpretation of the control chart, it is always valuable to associate several rules and use multirule tools. An obvious example is the case of ISO 7870-2 rule 1 (one measurement outside the +3s or −3s zone):
• If ISO 7870-2 rules 1 and 3 are met at the same time, this could mean that the out-of-control measurement is a consequence of the observed trend in the measurements, and an investigation for an assignable cause of this trend should be conducted.
• If ISO 7870-2 rule 1 is met at the same time as rule 4, 5, or 8, this could mean that the out-of-control measurement is

Table 5. Examples of Western Electric - ISO 7870-2 Rules for Control Chart Results Interpretation

a consequence of the observed change in the analytical procedure measurements dispersion, and an investigation for an assignable cause for the increase in measurements dispersion should be conducted.
• If no other rule is met at the same time, this could mean that an error has been made in the analytical procedure application, and an investigation for an assignable cause should be conducted for this result.

Therefore, an interpretation pattern may be proposed for interpretation of the SAPPMC control charts. If a dispersion-of-measurements control chart, either an S or R control chart, is available together with a mean control chart, it should be examined first to identify unusual trends or patterns in the measurements dispersion, as changes in measurements dispersion might explain unexpected behaviors of the mean values.

An example template for control chart interpretation is given in Figure 3, in the case of Shewhart control charts applied to SAPPMC data (i.e., system suitability or quality control check sample data).

Estimation of Long-Term Precision. Besides graphical presentation and monitoring of analytical performance attributes and parameters, these data can be used to calculate average parameters. In the (usual) case of replication parameters, the respective precision levels are addressed. For some examples, see Table 6. Due to the large number of data that become available over time, these parameters are statistically very reliable and an excellent estimation of the true (population) parameters.
However, results impacted by special cause variation (proven or made likely through an investigation) should not be included in the calculation. As long as no changes occur that impact the respective parameter, the calculation should proceed in an ongoing manner, increasing the reliability. Often, the average parameters are also used to establish limits for monitoring or control charts. In these cases, “overadjustment” should be avoided, i.e., the limits should only be adjusted in the case of relevant changes.
An example of the estimation of long-term precision is provided in the Supporting Information.
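The following is a minimal, hypothetical sketch (not the Supporting Information example) of how control sample results from many test occasions could be pooled: the pooled within-occasion variance estimates repeatability, and the standard deviation of the reportable values across occasions estimates the long-term (intermediate) precision of the reportable value.

```python
import numpy as np

# Hypothetical control-sample results (% of nominal), grouped by test occasion;
# each occasion uses the same replication strategy as the routine sample.
occasions = [
    [99.8, 100.2], [100.5, 100.1], [99.4, 99.9],
    [100.8, 100.6], [99.7, 100.0], [100.3, 99.6],
]

reportable = np.array([np.mean(o) for o in occasions])

# Repeatability: pooled within-occasion variance (equal group sizes here)
within_var = np.mean([np.var(o, ddof=1) for o in occasions])

# Precision of the reportable value: standard deviation of occasion means
intermediate_sd = np.std(reportable, ddof=1)

print(f"repeatability SD              = {np.sqrt(within_var):.3f} %")
print(f"intermediate precision SD     = {intermediate_sd:.3f} % (reportable value)")
```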
■ PERFORMANCE ASSESSMENT AND CONTINUOUS IMPROVEMENT OF ANALYTICAL PROCEDURES
A monitoring program will often include parameters or attributes that only have the potential to impact procedure performance (in addition to those parameters with a confirmed link to performance) and should clearly outline alert limits/rules and hard action limits (control limits). It is important to note that not all trend alerts require a formal investigation, deviation, and corrective/preventative actions. Context is very important when deciding on the appropriate action to take in response to an alert. During routine use of the procedure, different types of variation may occur:45
• Special cause variation (e.g., new and/or unexpected variation): this can be due to signal shifts, drift, and observations which might fall well above or below the performance requirement.
• Acceptable common cause variation: the procedure is considered fit-for-purpose when all observations are within the predefined upper and lower established limits. Usually, common cause variation does not trigger an action and is acceptable (no special cause variation exists).
• Unacceptable common cause variation: occurs when a few observations fall outside the performance requirement (e.g., a noise variable has been found to have an impact on routine performance but not during the procedure development stage, reflecting a lack of risk management and of proper investigation of the origin of the variation affecting performance).
During the investigation, particular attention should be given to special cause variation (shifts, drift, and deviation) and unacceptable common cause variation (unacceptable noise). The violation of alert limits and rules may not necessarily require immediate actions, and this should be justified in the monitoring plan. In the case of violation of “hard” limits/rules, an investigation must be initiated to identify the root cause. If a trend alert has no confirmed correlation with procedure performance, and there are no simultaneous alerts for critical procedure performance attributes or out-of-trend (OOT) batch data in the trending program, then the most pragmatic course of action may be to check for any worsening trends at the next scheduled routine monitoring check (e.g., if an OOT peak capacity is observed during routine monitoring of a chromatographic method, but the resolution of the critical pair of peaks is on trend, then there may not be an immediate benefit in a formal investigation at this time, but it would be advisable to monitor both peak capacity and critical pair resolution more closely at subsequent scheduled monitoring checks). In contrast, if there is a confirmed correlation with procedure performance or an observed impact on batch data, then a formal OOT investigation should be conducted, in line with GMP.
As a result of the overall performance assessment, suspect but noncritical behavior or shifts in the long-term parameters may be further investigated, and the monitoring plan and control strategy may be updated to promote continuous improvement of procedure performance. Small changes in control limits should be avoided (“over-adjustment”). Procedure performance improvements could be relatively small, such as improvements to training plans or to the wording in method documentation to improve the clarity and consistency of procedure operation. They could relate to instrument or reference standard management or, sometimes, to more fundamental procedure changes that require a regulatory postapproval change, e.g., changes to critical analytical procedure parameters or analytical technology.

Table 6. Examples Show Average Precisions

attributes/parameters | type of sample/data | precision level (variances)
range, RSD from replicate measurements (same preparation)a,c | sample preparation; reference standard preparation | measurement (injection precision/variance)
range, RSD from replicates (independent preparations)a,b | sample preparation; reference standard preparation | repeatability (sample/RS preparation variance)
relative root-mean-square error of regression RMSEa,c | calibration solutions | variability of calibration
single results: range, RSD from replicatesa,b,d | control samplef | intermediate precision/reproducibility (between-series variance)
result | control samplef | precision (variance of the reportable value)
range, RSD from replicates within storage intervalsa,b | stability data (sample, reference standard) | injection precision or repeatability
all storage intervalse | stability data (sample) | intermediate precision/reproducibility

a Calculation from duplicate range or from variance (n > 2). b Variance component analysis. c It may be instrument-dependent. d Intermediate precision, reproducibility constitute the precision of a single reportable value. e Calculation from RMSE of linear regression. See ref 34. f Same replication strategy as used for sample preparation.
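As a small, hypothetical illustration of footnote a (calculation from duplicate ranges), the repeatability standard deviation can be estimated from the mean range of duplicate determinations using the d2 factor (d2 = 1.128 for n = 2), or equivalently from the pooled variances:

```python
import numpy as np

# Hypothetical duplicate determinations (e.g., % label claim) from many series
duplicates = [
    (99.6, 100.1), (100.3, 100.0), (99.8, 100.4),
    (100.2, 99.9), (99.5, 99.9), (100.1, 100.5),
]

ranges = [abs(a - b) for a, b in duplicates]
mean_range = np.mean(ranges)

d2 = 1.128  # bias-correction factor for ranges of subgroups of size n = 2
sd_from_range = mean_range / d2

# Equivalent estimate from the pooled variances (footnote a, "or from variance")
sd_from_var = np.sqrt(np.mean([np.var(pair, ddof=1) for pair in duplicates]))

print(f"repeatability SD (range method)    = {sd_from_range:.3f}")
print(f"repeatability SD (pooled variance) = {sd_from_var:.3f}")
```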


Figure 4. Analytical procedure performance monitoring process and opportunities for improvement.

Opportunities for improvement can be flagged (Figure 4) during the periodic review (e.g., during a scheduled revisit of the analytical procedure risk assessment and PPC&ER) or in response to a procedure performance trend alert during routine performance monitoring (e.g., using SAPPMC).
Below are some real examples of performance monitoring in action. In all cases, the benefits of proactive monitoring and improvement action have been shown to result in increased efficiency, reduced QC downtime, and ultimately more robust supply chains. Other examples were recently published.22
Example - HPLC Assay Procedure Performance Monitoring (Reference Standard Monitoring). An out-of-trend (OOT) deviation event resulted in significant delays to the release of drug product batches and in wasted QC time and resources. A PPC&ER determined the root cause to be a problem with the new reference standard batch, which is out of trend by comparison with previous batches in Figure 5A. A calibration standard peak area response has now been added to the routine procedure performance monitoring plan for this assay in order to provide an early detection mechanism for any potential future challenges with this unusually complex reference standard. Control charts with preset control limits will allow detection of early signals of drift, enabling reduced rates of initial OOT/OOS results due to procedure error through proactive action.
Example - UV Assay Procedure Performance Monitoring. Multiple OOT results were generated when new analysts began performing the UV assay test. This resulted in delayed test results and wasted QC time for deviations and CAPAs that did not address the root cause. Proactive monitoring of the percent nominal response for a quality control sample now provides great insight into test-occasion-to-test-occasion procedure variability. Trend rule violations (e.g., data points outside of 3 SD from the mean (red line)) are a signal that a pattern in the data may be unusual and should be looked at more closely (Figure 5B). Filtering using metadata, e.g., color-coding by analyst, reagent batch, instrument, etc., can help to identify the potential root cause. Trending by analyst can identify a need for supplementary training (useful for methods such as this one, with complex sample preparation steps).

Figure 5. Procedure performance monitoring: (A) HPLC assay and (B) UV assay.
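As a hypothetical sketch of the trending used in the UV example (the data, analyst labels, and limits are invented), the percent nominal control sample response can be plotted per test occasion, color-coded by analyst, with ±3SD limits:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical QC control-sample results (% of nominal) per test occasion
occasion = np.arange(1, 13)
percent_nominal = np.array([100.1, 99.8, 100.3, 99.9, 100.2, 100.0,
                            98.6, 101.5, 98.9, 100.1, 99.7, 101.4])
analyst = np.array(["A", "A", "A", "B", "B", "A",
                    "C", "C", "C", "B", "A", "C"])

# Limits derived from historical performance (here, simply this series)
mean, sd = percent_nominal.mean(), percent_nominal.std(ddof=1)
colors = {"A": "tab:blue", "B": "tab:green", "C": "tab:orange"}

for name in sorted(set(analyst)):
    mask = analyst == name
    plt.scatter(occasion[mask], percent_nominal[mask],
                color=colors[name], label=f"analyst {name}")

plt.axhline(mean, color="gray")            # center line
for k in (-3, 3):
    plt.axhline(mean + k * sd, color="red", linestyle="--")  # trend rule: +/-3SD

plt.xlabel("test occasion")
plt.ylabel("control sample response (% nominal)")
plt.legend()
plt.show()
```

Color-coding by other metadata (reagent batch, instrument, etc.) follows the same pattern and makes it easier to see whether alerts cluster around a particular factor.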
■ CONCLUSION AND OUTLOOK
Ongoing procedure performance verification provides data and information about procedure performance over time.

The risk category of analytical procedures, usually established based on their complexity and intended purpose, can drive the selection of appropriate analytical procedure performance indicators and their assessment frequency. The extent of the routine monitoring required should be defined based on risk assessment, where performing a procedure performance cause and effect review (PPC&ER) provides a useful way to identify the most value-adding procedure attributes and/or performance parameters. The application of traditional SAPPMC tools to the verification of procedure performance on a continuous basis can be very useful to help identify adverse trends and detect shifts that may affect analytical procedure performance, so that corrective and preventative actions (action plan) can be implemented to ensure that the procedure continues to be fit for use. It is worth noting that not all trend alerts require a formal investigation or deviation and corrective/preventative actions; the context is very important when deciding on the appropriate action to take in response to an alert. Among the SAPPMC tools available, control charts are important and powerful for ongoing monitoring of critical procedure performance attributes/parameters, and they can also help highlight opportunities for proactive improvement of procedure performance over time. The ATP can be used as an overarching element which ties together all stages and activities that take place within the APLC. The ATP requirements relate to the reportable value and can therefore usually not be verified directly, which is a challenge for ATP criteria verification during routine analysis. This paper shows that ATP performance criteria can be verified directly (e.g., by trending the precision and accuracy of results for a control sample obtained using the same replication strategy as for the sample) or indirectly, by ensuring that key analytical procedure attributes linked to the ATP are within appropriate limits. Different document types and programs in the PQMS may also be leveraged to guide organizations on how to design, execute, and document analytical procedure lifecycle management, through policies, SOPs, job aids, and training programs. In summary, the knowledge acquired during stage 3 can be used to react to changes in procedure performance, to plan and implement changes, to improve the procedure, to communicate to regulatory authorities, and to help identify the need for the ACS review, i.e., procedure performance verification is part of the overall procedure risk management process.

■ ASSOCIATED CONTENT
* Supporting Information
The Supporting Information is available free of charge at https://pubs.acs.org/doi/10.1021/acs.analchem.3c03708.
Factors for constructing Shewhart control charts (X, X-bar, R, and S charts) for continuous variables (PDF)
Example of estimation of long-term precision for an HPLC-UV compendial assay procedure without initial ATP (PDF)

■ AUTHOR INFORMATION
Corresponding Author
Amanda M. Guiraldelli − US Pharmacopeia, Rockville, Maryland 20851, United States; orcid.org/0000-0003-4857-716X; Phone: +491755223723; Email: [email protected]

Authors
Phil J. Borman − US Pharmacopeia, Rockville, Maryland 20851, United States; orcid.org/0000-0002-6742-5826
Jane Weitzel − US Pharmacopeia, Rockville, Maryland 20851, United States
Sarah Thompson − AstraZeneca, Macclesfield, Cheshire SK10 2NA, United Kingdom
Joachim Ermer − US Pharmacopeia, Rockville, Maryland 20851, United States
Jean-Marc Roussel − US Pharmacopeia, Rockville, Maryland 20851, United States
Jaime Marach − US Pharmacopeia, Rockville, Maryland 20851, United States; orcid.org/0009-0000-0574-084X
Stephanie Sproule − US Pharmacopeia, Rockville, Maryland 20851, United States
Horacio N. Pappa − US Pharmacopeia, Rockville, Maryland 20851, United States

Complete contact information is available at: https://pubs.acs.org/10.1021/acs.analchem.3c03708

Notes
The authors declare no competing financial interest.

■ ACKNOWLEDGMENTS
This publication was supported by the US Pharmacopeia Measurement and Data Quality Expert Committee. The content is solely the responsibility of the authors and does not necessarily represent the official views of the United States Pharmacopeia.

■ REFERENCES
(1) Jackson, P.; Borman, P.; Campa, C.; Chatfield, M.; Godfrey, M.; Hamilton, P.; Hoyer, W.; Norelli, F.; Orr, R.; Schofield, T. Anal. Chem. 2019, 91, 2577−2585.
(2) Borman, P.; Campa, C.; Delpierre, G.; Hook, E.; Jackson, P.; Kelley, W.; Protz, M.; Vandeputte, O. Anal. Chem. 2022, 94 (2), 559−570.
(3) Barnett, K.; Chestnu, S.; Clayton, N.; Cohen, M.; Ensing, J.; Graul, T.; Hanna-Brown, M.; Harrington, B.; Morgado, J. Defining the Analytical Target Profile. Pharm. Eng. 2018. https://ispe.org/pharmaceutical-engineering/march-april-2018/defining-analytical-target-profile (accessed November 7, 2023).
(4) Schweitzer, M.; Pohl, M.; Hanna-Brown, M.; Nethercote, P.; Borman, P.; Hansen, G.; Smith, K.; Larew, J. Pharm. Technol. 2010, 34 (2), 52−59.
(5) USP General Chapter <1220> Analytical Procedure Lifecycle, 2022. https://online.uspnf.com/uspnf/document/1_GUID-35D7E47E-65E5-49B7-B4CC-4D96FA230821_2_en-US?source=Search%20Results&highlight=1220 (accessed November 10, 2023).
(6) Barnett, K. L.; McGregor, P. L.; Martin, G. P.; LeBlond, D. J.; Weitzel, M. L. J.; Ermer, J.; Walfish, S.; Nethercote, P.; Gratzl, G. S.; Kovacs, E. Analytical Target Profile: Structure and Application Throughout the Analytical Lifecycle. Pharm. Forum 2016, 42 (5).
(7) Rignall, A.; Borman, P.; Hanna-Brown, M.; Grosche, O.; Hamilton, P.; Gervais, A.; Katzenbach, S.; Wypych, J.; Hoffmann, J.; Ermer, J.; McLaughlin, K.; Uhlich, T.; Finkler, C.; Liebelt, K. Pharm. Technol. 2018, 42 (12), 18−23.
(8) Guiraldelli, A. M.; Lourenço, F. R.; Borman, P.; Weitzel, J.; Roussel, J. M. Analytical Target Profile (ATP) and Method Operable Design Region (MODR). In Introduction to Quality by Design in Pharmaceutical Manufacturing and Analytical Development; Breitkreitz, M. C., Goicoechea, H. C., Eds.; Springer, 2023; pp 199−219.
(9) Guiraldelli, A. M.; Lourenço, F. R.; Borman, P.; Weitzel, J.; Roussel, J. M. Analytical Quality by Design Fundamentals and Compendial and Regulatory Perspectives. In Introduction to Quality by Design in Pharmaceutical Manufacturing and Analytical Development; Breitkreitz, M. C., Goicoechea, H. C., Eds.; Springer, 2023; pp 163−198.


(10) ICH Guideline Q2(R1) Validation of Analytical Procedures: Text and Methodology. 2005. https://database.ich.org/sites/default/files/Q2%28R1%29%20Guideline.pdf (accessed November 10, 2023).
(11) USP General Chapter <1225> Validation of Compendial Procedures. https://online.uspnf.com/uspnf/search?facets=%5B%5B%22document-status_s%22%2C%5B%22Official%22%2C%22To%20Be%20Official%22%2C%22Commenting%20open%22%5D%5D%5D&query=1225 (accessed November 10, 2023).
(12) Nethercote, P.; Borman, P.; Bennett, T.; Martin, G.; McGregor, P. QbD for Better Method Validation and Transfer. Pharma Manufacturing 2010. https://www.pharmamanufacturing.com/development/qbd/article/11360549/qbd-for-better-method-validation-and-transfer (accessed November 7, 2023).
(13) Guiraldelli, A.; Weitzel, J. Evolution of Analytical Procedure Validation Concepts: Part I − Analytical Procedure Life Cycle and Compendial Approaches. Pharm. Technol. 2022. https://www.pharmtech.com/view/evolution-of-analytical-procedure-validation-concepts-part-i-analytical-procedure-life-cycle-and-compendial-approaches (accessed November 7, 2023).
(14) FDA. Guidance for Industry. Process Validation: General Principles and Practices. 2011. https://www.fda.gov/files/drugs/published/Process-Validation--General-Principles-and-Practices.pdf (accessed November 7, 2023).
(15) ICH Guideline Q8(R2) Pharmaceutical Development. 2009. https://database.ich.org/sites/default/files/Q8%28R2%29%20Guideline.pdf (accessed November 7, 2023).
(16) ICH Guideline Q9 Quality Risk Management. 2005. https://database.ich.org/sites/default/files/ICH_Q9%28R1%29_Guideline_Step4_2023_0126_0.pdf (accessed November 7, 2023).
(17) ICH Guideline Q10 Pharmaceutical Quality System. 2008. https://database.ich.org/sites/default/files/Q10%20Guideline.pdf (accessed November 7, 2023).
(18) ICH Guideline Q11 Development and Manufacture of Drug Substances (Chemical Entities and Biotechnological/Biological Entities). 2012. https://database.ich.org/sites/default/files/Q11%20Guideline.pdf (accessed November 7, 2023).
(19) ICH Guideline Q12 Technical and Regulatory Considerations for Pharmaceutical Product Lifecycle Management. 2019. https://database.ich.org/sites/default/files/Q12_Guideline_Step4_2019_1119.pdf (accessed November 7, 2023).
(20) ICH Draft Guideline Q13 Continuous Manufacturing of Drug Substance and Drug Products. 2021. https://database.ich.org/sites/default/files/ICH_Q13_Step4_Guideline_2022_1116.pdf (accessed November 7, 2023).
(21) ICH Draft Guideline Q14 Analytical Procedure Development. 2022. https://database.ich.org/sites/default/files/ICH_Q14_Document_Step2_Guideline_2022_0324.pdf (accessed November 7, 2023).
(22) Borman, P.; Guiraldelli, A.; Weitzel, J.; Thompson, S.; Ermer, J.; Sproule, S.; Roussel, J.-M.; Marach, J.; Pappa, H. Pharm. Technol. 2023, 40−44.
(23) Ermer, J.; Aguiar, D.; Boden, A.; Ding, B.; Obeng, D.; Rose, M.; Vokrot, J. J. Pharm. Biomed. Anal. 2020, 181, No. 113051.
(24) Chatfield, M. J.; Borman, P. J. Anal. Chem. 2009, 81, 9841−9848.
(25) Vukovinsky, K.; Watson, T. J.; Ide, N. D.; Wang, K.; Dirat, O.; Subashi, A. K.; Thomson, N. M. Pharm. Technol. 2016, 40 (3), 34−44.
(26) Borman, P. J.; Chatfield, M. J.; Damjanov, I.; Jackson, P. Anal. Chim. Acta 2011, 703, 101−113.
(27) Borman, P.; Schofield, T.; Lansky, D. Pharm. Technol. 2021, 45 (4), 48−56.
(28) Ermer, J.; Agut, C. J. Chromatogr. A 2014, 1353, 71−77.
(29) Ishikawa, K. What Is Total Quality Control? The Japanese Way; Prentice-Hall: Englewood Cliffs, 1985.
(30) Wheeler, D. J.; Chambers, D. S. Understanding Statistical Process Control, 3rd ed.; Statistical Process Controls, 2011.
(31) Borman, P. J.; Chatfield, M. J.; Damjanov, I.; Jackson, P. Anal. Chem. 2009, 81, 9849−9857.
(32) Burdick, R. K.; Ermer, J. J. Pharm. Biomed. Anal. 2019, 162, 149−157.
(33) Ermer, J.; Ploss, H. J. J. Pharm. Biomed. Anal. 2005, 37 (5), 859−870.
(34) Ermer, J.; Arth, C.; de Raeve, P.; Dill, D.; Friedel, H. D.; Höwer-Fritzen, H.; Kleinschmidt, G.; Köller, G.; Köppel, H.; Kramer, M.; Maegerlein, M.; Schepers, U.; Wätzig, H. J. Pharm. Biomed. Anal. 2005, 38 (4), 653−663.
(35) USP General Chapter <1210> Statistical Tools for Procedure Validation. 2018. https://online.uspnf.com/uspnf/document/1_GUID-13ED4BEB-4086-43B5-A7D7-994A02AF25C8_7_en-US?source=Search%20Results&highlight=1210 (accessed November 7, 2023).
(36) Feinberg, M.; Boulanger, B.; Dewé, W.; Hubert, P. Anal. Bioanal. Chem. 2004, 380, 502−514.
(37) Wernimont, G. Ind. Eng. Chem. 1946, 18 (10), 587−592.
(38) Howarth, R. Analyst 1995, 120, 1851−1853.
(39) Montgomery, D. C. Introduction to Statistical Quality Control, 8th ed.; Wiley, 2019.
(40) International Standard. ISO 7870-2. Control Charts - Part 2: Shewhart Control Charts, 2nd ed.; International Organization for Standardization, 2023.
(41) Kwak, S.; Kim, J. Korean J. Anesthesiol. 2017, 70, 144−156.
(42) Stuart Hunter, J. Qual. Eng. 1989, 2 (1), 13−19.
(43) Westgard, J. O.; Barry, P. L.; Hunt, M. R.; Groth, T. Clinical Chemistry 1981, 27, 493−501.
(44) Levey, S.; Jennings, E. R. J. Clin. Pathol. 1950, 20 (11), 1059−1106.
(45) Martin, G. P.; Barnett, K. L.; Burgess, C.; Curry, P. D.; Ermer, J.; Gratzl, G. S.; Hammond, J. P.; Herrmann, J.; Kovacs, E.; LeBlond, D. J.; LoBrutto, R.; McCasland-Keller, A. K.; McGregor, P. L.; Nethercote, P.; Templeton, A. C.; Thomas, D. P.; Weitzel, J. Lifecycle Management of Analytical Procedures: Method Development, Procedure Performance Qualification, and Procedure Performance Verification. Pharm. Forum 2013, 39 (5).
