OOT, OOS and Deviations: ICH Guidelines and Improvement Questions
Evald PRESENTATION ©2014, DO NOT REPRODUCE
OOT (Out-of-Trend) Evaluation
• What is an OOT result?
– A result that does not follow the expected trend,
either in comparison with other stability batches
or with respect to previous results collected
during a stability study.
– More complicated than a comparison to
specification limits.
– Procedures to identify OOT depend on available
data that define the norm.
OOT Results Modus Operandi
• OOT results do not follow the expected trend.
– They are not necessarily Out-of-Specification.
• OOT results are somewhat of a rogue topic.
– United States vs. Barr Laboratories (1993) forced
an FDA draft guidance on OOS.
– This guidance footnotes that similar guidance can
be used to examine OOT results.
– But there is no clear legal or regulatory basis to
require consideration of data within specification
but not following expected trends.
OOT Identification as a Necessity
• Common sense:
– OOT analysis could predict the likelihood of future
OOS results.
• The Nature of Stability Data
– Stability data is a routine regulatory submission.
– Stability data can set internal release limits.
– Stability data estimates product change to expiry.
• OOT identification is crucial to both regulatory and business concerns.
Types of OOT Identification
• Qualitative
– Graphical
• Quantitative
– Statistical
Graphical OOT Evaluation
[Figure: Lot 4 – Result (units) vs. Time (months), 0–36 months]
Graphical OOT Evaluation
Lot #4 through Lot #7 – Results vs. Time
[Figure: Results (units) vs. Time (months) for Lots 4–7, 0–36 months]
Statistical Evaluation of OOT
• Pros
– accounts for data variability
– assay-specific
• Three methods:
– Regression Control Chart
– By Time Point
– Slope Control Chart
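The deck names the slope control chart without further detail. A common form, assumed here for illustration (including the k = 3 limit width), fits a least-squares slope per batch and flags a new batch whose slope falls outside the spread of historical batch slopes:

```python
# Hedged sketch of a slope control chart: compare a new batch's fitted
# degradation slope to limits built from historical batch slopes.
# All data values and the k = 3 multiplier are illustrative assumptions.
from statistics import mean, stdev

def fit_slope(points):
    """Ordinary least-squares slope of result vs. time for one batch."""
    xs, ys = zip(*points)
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x in xs)
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)

def slope_oot(new_batch, historical_batches, k=3):
    """Flag the new batch if its slope is outside mean +/- k*s of history."""
    slopes = [fit_slope(b) for b in historical_batches]
    m, s = mean(slopes), stdev(slopes)
    return abs(fit_slope(new_batch) - m) > k * s
```

A batch that degrades much faster than its predecessors is flagged even while every individual result is still within specification.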
Regression Control Chart
• Uses & Assumptions
– Within a batch or between batches
– Normal and independent distribution of data
– Constant variability across time points
– Common linear slope for all batches
Regression Control Chart

Time Point (x)   Result (y)     XY      XX
 0                 98             0       0
 3                104           312       9
 6                 90           540      36
 9                 98           882      81
12                 97          1164     144
18                100          1800     324
24                 98          2352     576
36                 97          3492    1296
Regression Control Chart – expected result ± (k × s)
[Figure: Results (units) vs. Time (months) for Lot 4, 0–36 months, showing the linear regression line with upper and lower regression control limits (UTL/LTL)]
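The regression control limits can be sketched from the Lot 4 data above. The k = 3 width multiplier is an assumption for illustration, not taken from the deck:

```python
# Sketch of the regression control chart calculation for the Lot 4 data.
# Limits are expected result +/- k*s about the fitted line; k = 3 is an
# assumed multiplier.
data = [(0, 98), (3, 104), (6, 90), (9, 98),
        (12, 97), (18, 100), (24, 98), (36, 97)]

n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxy = sum(x * y for x, y in data)   # the XY column
sxx = sum(x * x for x, _ in data)   # the XX column

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Residual standard deviation about the fitted line (n - 2 d.f.)
sse = sum((y - (intercept + slope * x)) ** 2 for x, y in data)
s = (sse / (n - 2)) ** 0.5

k = 3  # assumed control-limit width

def limits(x):
    """Return (lower, upper) control limits at time x."""
    fit = intercept + slope * x
    return fit - k * s, fit + k * s
```

A new result at time x is OOT under this scheme when it falls outside `limits(x)`.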
By-Time-Point Chart
• Compares based on historical batches
• Assumes
– normal distribution
– all observations at a time point are independent
• Advantages
– level of confidence can be tailored to product
– no assumptions about the shape of the degradation curve
• Challenges
– if current data aren’t tested at nominal time points
• Tolerance interval is computed per time point using historic data
• Calculate: mean (x̄) and standard deviation (s)
• interval = x̄ ± ks
By-Time-Point Chart – interval = x̄ ± ks
[Figure: Results (units) vs. Time (months) for Lot 4 with per-time-point intervals, 0–18 months]
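The x̄ ± ks interval above can be sketched as follows; the historical values and k = 3 are illustrative assumptions, and in practice k would come from tolerance-interval tables at the chosen confidence and coverage:

```python
# Minimal by-time-point sketch: compare a new result at one time point
# against an interval built from historical batches at that time point.
# The historical values and k = 3 are illustrative assumptions.
from statistics import mean, stdev

historical_9m = [98.0, 99.5, 97.2, 100.1, 98.7]  # hypothetical prior batches at T=9M
k = 3  # coverage multiplier (assumed)

center = mean(historical_9m)
s = stdev(historical_9m)
lower, upper = center - k * s, center + k * s

def is_oot(result):
    """Flag a new result falling outside the x-bar +/- k*s interval."""
    return not (lower <= result <= upper)
```

Because each time point gets its own interval, no assumption is needed about the shape of the degradation curve, matching the advantage listed above.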
OOT in Degradants and Impurities
• Batches measured for degradation product and
impurities.
– percent area or percent
• Useful knowledge
– shape of trend
– distribution of results
• Differences from regular assay testing
– linearity
– constant variance of results
– What happens to variability as degradant level increases with time?
OOT in Degradants and Impurities
• Linearity and variance may not hold for
degradants and impurities
– consider the relationship between variability with
time and % degradant (both increase).
• Limit of Quantitation (LOQ)
– no number below LOQ
– reported: < LOQ [ICH: < RT (Reporting Threshold)]
• What is the result of truncating data?
– on variability?
– on valuable statistical information?
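The effect of truncating at the LOQ can be seen in a toy example (the values below are made up for illustration): substituting the LOQ for censored results compresses the apparent spread, which hides real variability and discards statistical information.

```python
# Illustration of what truncating at LOQ does to apparent variability.
# Hypothetical degradant results (percent area); values are illustrative.
from statistics import stdev

true_results = [0.02, 0.04, 0.05, 0.08, 0.11, 0.15]
LOQ = 0.05

# Reporting "< LOQ" and substituting the LOQ value censors the low tail.
reported = [max(r, LOQ) for r in true_results]

s_true = stdev(true_results)
s_reported = stdev(reported)  # smaller: censoring shrinks the spread
```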
OOT in Degradants and Impurities
• Example: a new peak forms
– should it exist?
– is it OOT?
– it is a new data point.
• Two options
– comparison to previous values from the batch
– comparison to previous values from other batches
OOT in Degradants and Impurities
• Comparison to previous values from batch
– degradation/impurity all above LOQ
– linear relationship
– assuming normality
• What if one or more of these criteria don't hold?
– Then identifying OOT results from the batch’s data
isn’t recommended.
– Why?
• OOT = deviation from the expected
• if T=0M, 3M and 6M are below LOQ and 9M is above LOQ
then a possible underlying trend between 0 and 9 can’t be
outlined by analyzing only the batch in question.
OOT in Degradants and Impurities
• Comparison of new value to values from other
batches.
• Three options:
– All values are above LOQ
– All values are below LOQ
– Portions of the data are below LOQ
OOT in Degradants and Impurities
• All values are above LOQ
– By-time-point method
– Interval-normality assumed
• Skewed distribution:
– Plot (x, y)
– Plot (log(x), log(y))
– Analyze the transformation
[Figure: the same data plotted on a normal scale and on a log scale]
OOT in Degradants and Impurities
• All values are below LOQ
– Use LOQ value.
• any result above LOQ is an OOT result
• requires a sufficient amount of data
Implementation Challenges
• ID during stability is more difficult than ID
during release.
– Stability studies are less frequent
– Batch release is a single time point; stability results change over time
– Experience with product is required
– Contract vs. in-house evaluation
– Computer systems treat data per time-point
– No set definition of OOT prior to analysis
Implementation Challenges
• Definitions:
– a result is OOT if it is at odds with previous test
results for that batch
– a result is OOT if it is at odds with previous test
results from other batches at that time-point
21 CFR 211.192
• All OOS incidents must be investigated.
• Phases
– Laboratory Investigation
– Full-scale Investigation
• Responsible individuals:
– Analysts
– Supervisors
– QA
21 CFR 211.192
• Significant aspects of an investigation
– Prompt
– Impartial
– Well-documented
– Well-founded scientifically
– 30-day turnaround
• OOS Procedure for Stability Studies is necessary
– Multitude of reasons for OOS (measurement, MFG)
– Multiple outcomes (batch failure, batch rejection)
Phase 1 – Laboratory Investigation
• Laboratory Errors
– Analyst
• Calibrated Equipment
• System Suitability
• Specifications
• Unexpected Results
• Stopping Erroneous Testing
– Supervisor
• Timely assessment
• Confirmation of Procedural Compliance
• Examination of Raw Data
• Confirmation of Instrument-performance
• Examination of Solutions, Reagents & Standards
• Evaluation of performance of test method
• Documentation of Assessment
Laboratory Investigation Checklist
Product Name: ___________ Issued by/date: _________
Stability Study #: ___________ Lot #: ________ Sample #: _________
Method: ___________ Method #: ________ Test Date: _________
Equipment IDs: ___________ Analyst: ________ Suitability: yes / no
Sample
Sample ID and Condition Satisfactory? y/n
Packaging Satisfactory? y/n
Reagent
Correct Reagent Used? y/n
Within expiry date? y/n
Glassware
Correct glassware used? y/n
Solvent-washed/dried glassware used? y/n
Correct volume/volumetric ware used? y/n
Equipment
Equipment qualified for intended purpose? y/n
Equipment within calibration period? y/n
Equipment Setting Appropriate? y/n
Column
Correct column used per chromatography method? y/n
Column wash steps completed prior to injection? y/n
Analyst Training
Analyst trained on use of equipment? y/n
Analyst trained on analytical method? y/n
SOP Steps
Weights in correct range? y/n
Dilutions performed per analytical method? y/n
All steps performed per analytical method? y/n
Calculations
Software Qualified? y/n
All calculations checked and found to be correct? y/n

Reference: Huynh-Ba, Kim, and N. Subbarao. "Evaluation of Stability Data." Handbook of Stability Testing in Pharmaceutical Development: Regulations, Methodologies, and Best Practices. 2008. p. 271.
Lab Investigation Results
• Aberrant result is due to laboratory error.
– Invalidation of result
– Retain investigation and attach raw data
• Frequency
– Relatively rare
– High frequency indicates lack of control over the lab/people
– Once the source of error is identified, appropriate Corrective and Preventive Action (CAPA) ensues.
Phase 2 – Full Scale OOS Investigation
• When Laboratory Investigation does not identify
root cause, initiate full-scale
– Involve functional groups
– Promptly initiate & complete
• Critical Parts
– Clear statement for reason of investigation
– Summary of aspects of potentially problematic mfg.
process
– Results of a documentation review and review of
historical data
– A description of corrective actions taken
Retesting vs. Resampling
• Retesting
– Investigation may involve retesting of original sample
• Should be original and homogenous material.
• Should be approved by Quality Unit
• Should be based on sound scientific judgment
• Should be performed by another analyst if possible
• Should be fully proceduralized (§ 211.160) and definite.
• Should not test into compliance
• In case of a defined laboratory error, the retest results would
substitute the original test result
• In case of no defined laboratory error, the retest results
would not substitute the original result and both would be
recorded and considered in batch release decisions.
Retesting vs. Resampling
• Resampling
– In studies for which the original sample cannot be
retested/resampled e.g. due to passage of time
• Pull new sample and designate as such e.g. T=6M is
OOS, then designate and pull T=7M (§ 211.165(c))
– If faced with insufficient samples to test OOS
• Consider obtaining samples from other programs e.g.
retention programs.
– Not recommended given variability of storage conditions
– Also must initiate new, accurate sampling method (§§ 211.160
and 211.165(c))
Statistical Trickery
• Outlier Testing (§ 211.165(d))
“arbitrary rejection or retention of an apparently aberrant
response can be a serious source of bias... the rejection of
observations solely on the basis of their relative magnitudes is a
procedure to be used sparingly” (USP <111>)
– Must be proceduralized (type, parameters, mechanics)
– May be appropriate for biological assays that have a high
variability.
– Is inappropriate for validated chemical methods with relatively
small variance, and cannot invalidate result.
• A result found to be a discordant outlier can be used as supporting
evidence with other data to evaluate significance.
– Is not applicable when the assay itself measures variability, e.g. content
uniformity, dissolution, release-rate determination.
• Outlying result can be valid non-uniform product.
Statistical Trickery
• e.g. Outlier Test

Time Point (x)   Result (y)     Z      p
0                  98          0.58   > 0.05
3                 104          1.13   > 0.05
6                  90          1.31   > 0.05
9                  98          0.09   > 0.05

[Figure: Result (units) vs. Time (months), 0–9 months]
• Combine with graphical analysis. Solidify the OOT definition.
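One way to put numbers behind a screen like the table above is a z-score against the batch mean with a two-sided normal p-value. This is an illustrative statistic, not necessarily the exact one used to produce the slide's Z column:

```python
# Illustrative outlier screen: z-score against the batch mean with a
# two-sided normal p-value. A result is flagged only when p < alpha,
# consistent with the "p > 0.05 -> not an outlier" reading of the table.
from math import erf, sqrt
from statistics import mean, stdev

def z_screen(results, alpha=0.05):
    """Return (z, p, flagged) per result; flagged means p < alpha."""
    m, s = mean(results), stdev(results)
    out = []
    for y in results:
        z = abs(y - m) / s
        p = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # two-sided tail area
        out.append((round(z, 2), round(p, 3), p < alpha))
    return out

screen = z_screen([98, 104, 90, 98])
```

As the slide concludes, none of these results is flagged; the statistical screen should still be paired with graphical analysis.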
Statistical Compliance
• USP 24 (2000, 1837)
– provides outlier guidance
– EP allows situational response
• Reserve for bioassays
– e.g. petri dish assay
– Perform test and use G values in USP 24 to define significance and
establish statistical basis for omitting an outlier.
N    3     4     5     6     7
G1   .976  .846  .729  .644  .586

N    8     9     10    11    12    13
G2   .780  .725  .678  .638  .605  .578

N    14    15    16    17    18    19    20    21    22    23    24
G3   .602  .579  .559  .542  .527  .514  .502  .491  .481  .472  .464
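A hedged sketch of using the tabulated critical G values: the exact G1/G2/G3 statistics are defined in USP 24, and the gap-over-range statistic used below is an assumption for illustration only, so consult the compendium before applying this in practice.

```python
# Sketch of an outlier screen against the tabulated critical G values.
# The statistic below (extreme gap divided by range) is an assumed,
# simplified stand-in for the USP 24 definitions.
G_CRIT = {3: .976, 4: .846, 5: .729, 6: .644, 7: .586,              # G1
          8: .780, 9: .725, 10: .678, 11: .638, 12: .605, 13: .578,  # G2
          14: .602, 15: .579, 16: .559, 17: .542, 18: .527, 19: .514,
          20: .502, 21: .491, 22: .481, 23: .472, 24: .464}          # G3

def suspect_outlier(results):
    """Return True if the most extreme value exceeds the critical G."""
    y = sorted(results)
    n = len(y)
    if n not in G_CRIT:
        raise ValueError("table covers N = 3 to 24 only")
    if y[-1] == y[0]:
        return False  # all values equal: nothing to flag
    g = max(y[1] - y[0], y[-1] - y[-2]) / (y[-1] - y[0])  # assumed statistic
    return g > G_CRIT[n]
```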
Statistical Compliance
• Data imputation (tread carefully)
– Mathematician vs. Mathemagician
– Reference USP 24 (2000, 1838), EP (2000, 270)
– The substituted value is there to aid calculation, not to add information to existing data.
– Subtract one degree of freedom for each substitution.
– y′ = (f·Tr′ + k·Tt′ − T′) / ((f − 1)(k − 1))
Where:
– f = number of sets (assay plates)
– k = number of treatments
– Tr′ = the incomplete total from the plate having the missing value
– Tt′ = the incomplete total from the treatment having the missing value
– T′ = the incomplete total from the assay as a whole
Confirmed OOS
• If full-scale investigation does not identify a laboratory
error then OOS is confirmed and is representative of
the lot.
• Commercial Lots
– Lots subject to regulatory application
– Submit Field Alert Report (FAR) within 3 days
• Development Lots
– Registration Studies
– Products intended for long-term storage at RT
– Review ICH Q1A(R2)
– Upon significant change, begin testing at intermediate
condition.
– Failure at accelerated condition may trigger label-changes.
Confirmed OOS
• OOS at Long-Term Storage
– Can trigger changes to packaging, formulation,
storage condition and proposed shelf-life.
– Removal of lot from ongoing clinical study is up to
Quality Unit.
• Trending OOS Results
– Considered to be best practice (proceduralized)
– Benefits: identify process improvements in lab, grow
database of elements (product, temp, method, cause).
– Aids periodic queries and Pareto charts for
continuous improvement.
Non-reportable OOS or Atypical Result
(Development, intermediates, special conditions)
[Flowchart: Is the product at the proposed condition and within shelf-life? Special conditions cover accelerated, photostability, excursion, and expired samples. A "Yes" branch with a first occurrence of failure initiates and documents an OOS investigation; the investigation then asks whether retest samples are available (if not, no retesting).]