From GLP to CLIA’88 (Clinical Laboratory
Improvement Amendments of 1988)
Method Verification and Validation
Donna M. Wolk, MHA, Ph.D., D(ABMM)
Southern Arizona VA Health Care System
University of Arizona
2/1/2006
Some images adapted from:
Wolk, D., S. Mitchell, and R. Patel. 2001. Principles of molecular
microbiology testing methods. Infect.Dis.Clin.North Am. 15:1157-1204.
Objectives
• Discuss the gap between research/GLP laboratory testing and
clinical (medical) laboratory testing (i.e., translational
laboratory testing)
• Overview CLIA’88 requirements for method
verification and validation in clinical laboratories
• Briefly discuss statistics used to avoid method
pitfalls
• Give examples of method verification and
validation issues

KEY Deficiencies
Cited in many translational labs

Assay Verification & Validation

In the Journey from Research Laboratory to
Diagnostic Laboratory……
[Diagram: the path from Research (scientific principle, “proof of concept,” state of the art)
through Standard Practices and Quality Systems to Diagnostic Laboratory Operations
with proven outcomes and indicators]
Research
Regulations &
Agencies
• Good Laboratory Practice (GLP), 21 CFR Part 58 –
Food and Drug Administration, 9/4/87; 144 requirements
• FDA and EPA perform lab inspections under
the Freedom of Information Act
Scientific/Operational Responsibilities
Rest with Bioindustry and Clinical Labs
Experimental Design/Technology Transfer

Diagnostic
Laboratory
Operations
• GOVERNED BY:
• FDA
• CLIA ’88
• NCCLS, now CLSI
• CAP
• State Inspections
• VA Regional Commission
• New: ISO for Healthcare
The FDA and Clinical Laboratories
• FDA-Cleared
• Method Performance & Reproducibility
• Clinical Relevance & Utility (clinical outcomes)
• ASR
• Method Performance & Reproducibility
• RUO
• No FDA review
• Home Brew, “Research Use Only”
• Governed by CLIA ‘88
For RUO and Home Brews
• CLIA’88 lab directors are legally responsible for their own lab
and for reference labs, even if they do NOT perform the test onsite
• Beware of under-verified methods
• Ask reference labs lots of questions
• Demand answers that help you interpret test results
• KEY Deficiencies Cited
• Impractical Method Design
• Inadequate Error Assessment
Luck of the Translational Lab
Scientific & Operational
Responsibilities for Experimental
Design and Technology Transfer
Molecular Assays: Most are
PCR-based
Amplicon Detection & Confirmation
• PCR
• Gel electrophoresis (detection only)
• DNA sequencing
• Probes (Southern blot or real-time PCR)
Example: DNA Target on Bacterial
Chromosome
[Figure: double-helix DNA strand, target sequence, and supercoiled bacterial chromosome.
Image courtesy of Roche Diagnostics, Inc.]
Traditional PCR: DENATURED DNA TARGET
[Figure: denatured DNA target (3’ and 5’ strands). A primer anneals to the denatured
template, is extended by Taq polymerase with synthetic dNTPs, and the completed
synthetic strands (amplicon) are detected.]
Wolk, D., et.al. 2001. Infect.Dis.Clin.North Am. 15:1157-1204.
Example: Real-time PCR
[Amplification plot (cycle threshold): Rn vs. cycle number (0–40). The sample with
template crosses the threshold at its cycle threshold (C.O. or Ct); the no-template
control remains below the threshold.]
Molecular Method Verification
CLIA-Characterized Assay
Performance & Error Assessment
• Verification: Characterize and compare the assay. Define
errors and limitations. Assess the risk of causing a change
in the interpretation of test results.
• Validation: Assess performance
• All the time
• Over time
• Under various conditions
• Prove error resistance and competency
Lab Director Responsibilities
• Design the translational assay to facilitate
characterization, and perform error and limitation
assessment
• Mimic reality as closely as possible; establish performance
characteristics in the actual specimen matrix
• STATISTICALLY determine/assess performance AND
limitations – they must be defined and relayed to
clients and decision makers!
• Determine the extent of verification: it MUST reflect the
criticality of the decisions made based on each assay
• Variability is inherent to any assay; the question is “How much is
too much?”
Initial Verification & Validation
Putting together the PACKET
for inspection
• Prepare a verification and validation packet for
EACH new method/target or sample
• SUMMARY: Who, what, when, where, how?
• Statistical characterization of the assay
• Policies, SOPs, document control
• Scientific, ethical, and operational assessments, in-lab
and throughout the health care system
Verification and Validation Packet
• WHY?
• Need for Assay: PURPOSE and Rationale, Clinical
Utility, Impact to Care
• Literature Review and References
• WHAT?
• ANALYTE and Assay type: Screen, confirm, or
diagnose?
• Type of Method: science, throughput, speed, costs,
adaptable to multi-tasking workflow, practicality
• Specimen requirements
Verification and Validation Packet
• WHO?
• Personnel Issues, Responsibility Levels, Training, Safety
• Ethical Issues, Risk Assessments
• Vendor & supply issues: storage, shelf life, availability,
service, tech support, contracting, etc.
• WHERE?
• Pre- and post-analytical issues
• Local issues with reference range, population
• HOW? Requires substantial data and statistical analysis to prove performance
Packet: What else?
• Establish that the method is fit for its purpose: are the
methods adaptable to healthcare/clinical labs?
• Clinical Utility, Criticality of Results
• Cost Analysis, Cost/benefit
• Expected Outcomes
• Implementation Specifics
• Documents: SOPS, Policies, Processes, Doc Ctrl.
• Method Performance: w/ SUMMARY/GRAPHICS,
Conclusions, Limitations, Comments
Analytic Method Performance
• Compare to the “gold standard” (target chi-square analysis,
or approximately 50 positives/100 negatives; see the sketch after this list)
• Recovery: spiked samples
• Specimen requirements: quality, volume, collection
• Specimen stability: preservation, storage
• Specimen type comparisons
• Carry-over studies and % contamination
• Bias, systematic and non-systematic
• Matrix or interference studies
• Established reference range, if applicable
• Statistical characterization… specifics to follow
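To make the gold-standard comparison concrete, here is a minimal sketch that runs a chi-square test on a hypothetical 2x2 agreement table; the counts are invented for illustration (sized roughly like the suggested ~50 positives/100 negatives), and a paired comparison design might instead call for McNemar’s test.

```python
# Minimal sketch: chi-square comparison of a candidate assay vs. a gold standard.
# Counts are hypothetical, sized roughly like the suggested 50 pos / 100 neg study.
from scipy.stats import chi2_contingency

# Rows: candidate assay result; columns: gold-standard result.
table = [[48, 3],    # candidate positive: 48 gold-standard pos, 3 gold-standard neg
         [2, 97]]    # candidate negative: 2 gold-standard pos, 97 gold-standard neg

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p_value:.2g}")
```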
Limitations of molecular methods
• CRITICAL TO ASSESS LIMITATIONS
• Analytical assay verification/validation
uncovers assay limitations
• Assess both virtual (in silico) and actual (wet-lab) limitations
Examples of Common Limitations
• False negatives:
• Limited sensitivity in real matrices
• Reaction inhibitors in nature
• Other matrix effects
• Antigenic variation of organisms
• False positives:
• Contamination with DNA
• Limited specificity testing during verification
• Unknown organisms present in nature
• Errors
• New reagent lots, primers/probes
• Lack of appropriate controls built into assay
STATISTICS
Determine Analytic Method Performance.
Qualitative Assay Characterization
ANALYSIS FOR EACH:
• Detection method
• Instrument
• Specimen type
• Storage condition or preservative
• Target
• Control
• Operator
Qualitative Analysis
• Accuracy compared to a gold standard or accurate control,
i.e., % agreement with the reference method or with spikes
(recovery experiments)
• Analytic sensitivity = lower detection limit (LDL),
in reagent water AND natural matrices
 Define vs. the 95% detection limit (DL-95%); see the sketch below
• Analytic specificity with closely related and likely
organisms, i.e., % agreement with spikes, known positive
specimens, or sequences (virtual)
 Cross-reactions with other organisms
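The DL-95% can be estimated by modeling detection rate against spike level. The sketch below fits a logistic curve (statsmodels) to the replicate hit rates shown in the example table that follows (4/10 at 1 spore, 7/10 at 2, 10/10 at 3); the approach and numbers are illustrative, not the study’s actual analysis.

```python
# Minimal sketch: estimate the 95% detection limit (DL-95%) by logistic
# regression of detection (0/1) on log10 spike level. Replicate counts are
# taken from the illustrative results table (4/10, 7/10, 10/10).
import numpy as np
import statsmodels.api as sm

spores   = np.repeat([1, 2, 3], 10)          # spores per extraction
detected = np.array([1]*4 + [0]*6 +          # 1 spore:  4/10 detected
                    [1]*7 + [0]*3 +          # 2 spores: 7/10 detected
                    [1]*10)                  # 3 spores: 10/10 detected

x = sm.add_constant(np.log10(spores))
fit = sm.Logit(detected, x).fit(disp=0)
b0, b1 = fit.params

# Spike level where the fitted detection probability reaches 95%
log10_dl95 = (np.log(0.95 / 0.05) - b0) / b1
print(f"Estimated DL-95%: {10 ** log10_dl95:.1f} spores/extraction")
```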
Results: Single Target

Spores/Extraction   % Accuracy (% Positive)   Without IC (n=10)
3 (DL-95%)          10/10 = 100%              10/10 = 100%
2                   7/10 = 70%                10/10 = 100%
1 (LDL)             4/10 = 40%                3/10 = 30%

[Also shown: Ct values of lab water at 5, 3, and 1 spore]
AWWARF #2901
Qualitative Analysis
• % Inhibitory: Inhibitors in natural matrices
 Matrix Analysis including interfering
substances
• % Contaminated: Extraction and NTC
 Use 5-10% Neg controls
• Stability: of positive controls or samples
Qualitative Analysis of Real-time PCR:
Use Ct values
• Precision/reproducibility (intra- and inter-assay): characterize
mean, SD, %CV, and 95% CI (at upper, mid, and lower limits);
see the sketch below
• On serial dilution data:
• Linearity, detection range
• PCR efficiency; can assess matrix effects, extraction,
or assay conditions
• ANCOVA or regression analysis on slopes
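As a minimal sketch of the precision summary above, the snippet below computes mean, SD, %CV, and a 95% confidence interval for hypothetical replicate Ct values at one concentration.

```python
# Minimal sketch: intra-assay precision summary for replicate Ct values
# (values are hypothetical).
import numpy as np
from scipy import stats

ct = np.array([28.1, 28.4, 27.9, 28.3, 28.0, 28.2])   # replicate Ct values

mean = ct.mean()
sd = ct.std(ddof=1)                                    # sample standard deviation
cv = 100 * sd / mean                                   # % coefficient of variation
half_width = stats.t.ppf(0.975, df=len(ct) - 1) * sd / np.sqrt(len(ct))  # 95% CI half-width
print(f"mean = {mean:.2f}, SD = {sd:.2f}, %CV = {cv:.1f}%, "
      f"95% CI = ({mean - half_width:.2f}, {mean + half_width:.2f})")
```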
Example Precision and Efficiency
Monitor ENC-PCR (Dose Response)
[Plot: Ct value vs. dose (concentration) and [log] dose of Microsporidia spike
for MilliQ water runs 3, 6, 7, 8, and 9 and the total population; Ct axis
approximately 25–45]
AWWARF #2901
Calculated Efficiency
• Serial dilutions
• Equation: E = 10^(–1/slope) – 1 (a worked sketch follows below)
• Lab water 85–94%; source water 50–90%
• Linearity: detection range 3 to ≥ 5 × 10^6 spores/reaction
AWWARF #2901
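A minimal sketch of the efficiency calculation: fit Ct against log10 dose for a serial dilution and apply E = 10^(–1/slope) – 1. The Ct values below are hypothetical; only the equation comes from the slide.

```python
# Minimal sketch: PCR efficiency from a serial-dilution standard curve,
# E = 10^(-1/slope) - 1. Ct values are hypothetical.
import numpy as np

log10_dose = np.array([5.0, 4.0, 3.0, 2.0, 1.0])        # log10 spores/reaction
ct         = np.array([22.1, 25.5, 28.9, 32.4, 35.8])   # mean Ct per dilution

slope, intercept = np.polyfit(log10_dose, ct, 1)         # linear fit: Ct vs. log10 dose
efficiency = 10 ** (-1 / slope) - 1
print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}")
```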
Clinical Outcomes Analysis
 Important for RISK ASSESSMENT
• Clinical sensitivity and specificity
• Sensitivity (of a diagnostic test): the proportion of truly
diseased persons, as measured by the gold standard,
who are identified as diseased by the test under study.
• Specificity (of a diagnostic test): the proportion of truly
nondiseased persons, as measured by the gold standard,
who are so identified by the diagnostic test under study.
• A prevalence model is used to calculate PPV/NPV; medical
assays may target high- or low-prevalence conditions
2x2 Table

                       Gold standard Positive    Gold standard Negative
                       (condition present)       (condition not present)
Test result Positive   True Positive             False Positive
Test result Negative   False Negative            True Negative
Frequency-Independent Properties:
Sensitivity = True Positives / (True Positives + False Negatives)
Specificity = True Negatives / (False Positives + True Negatives)
Frequency-Dependent Properties:
Positive Predictive Value = True Positives / (True Positives + False Positives)
Negative Predictive Value = True Negatives / (True Negatives + False Negatives)
(A worked sketch follows the link below.)
http://www.med.ualberta.ca/ebm/diagcalc.htm#sspv
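The sketch below applies the formulas above to a hypothetical 2x2 table, then recomputes PPV and NPV at several prevalences to show why the predictive values are frequency dependent (counts, prevalences, and the helper function are illustrative).

```python
# Minimal sketch: sensitivity/specificity from a hypothetical 2x2 table, plus
# prevalence-adjusted predictive values.
tp, fp, fn, tn = 45, 3, 5, 97

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

def predictive_values(sens, spec, prevalence):
    """PPV and NPV for a given disease prevalence (Bayes' rule)."""
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
    return ppv, npv

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
for prev in (0.01, 0.10, 0.50):      # low- vs. high-prevalence settings
    ppv, npv = predictive_values(sensitivity, specificity, prev)
    print(f"prevalence = {prev:.0%}: PPV = {ppv:.2f}, NPV = {npv:.2f}")
```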
Other Verification Issues
• BEWARE of improved gold standards
• e.g., the new method may detect organisms that were
missed by traditional methods
• Limitations of Discrepant Analysis
• Useful statistical tools: Chi Square and 2x2
tables, t-tests, ANCOVA, regression analysis,
other descriptive stats
• Define specimen requirements: quality, volume,
collection method, etc.
• Summarize performance and define limitations
Add-ons……
• Limitations of QUANTITATIVE ASSAYS:
most provide “relative” quantitation
• DON’T over-interpret; see the reference below
Shepley, D.P. and Wolk, D.M. 2004. Quantitative
Molecular Methods. in Molecular Microbiology:
Diagnostic Principles and Practice, ed. Persing, D.H. et.
al., ASM Press. Washington DC
Other Issues
• No culture or visual identification; viability
is more difficult to assess than presence
VIRTUAL ISSUES: Search Sequence
Databases: GenBank®
• GenBank: NIH publicly available genetic sequence
database (Nucleic Acids Research 2003 Jan
1;31(1):23-7)
• Approx. 22 million sequences as of Jan. 2003
• Part of the International Nucleotide Sequence Database
Collaboration
• DNA Data Bank of Japan (DDBJ)
• European Molecular Biology Laboratory (EMBL)
• GenBank at NCBI (National Center for Biotechnology
Information)
• Search nucleotide databases for sequences of
interest
FINAL PACKET REVIEW
• DOCUMENTATION of Director Review of
Data and Process, Signature, Date
• Documentation & record storage
VALIDATION
Validation: Ongoing Assessment &
Statistical Analysis
The documentation that a verified
assay performs with expected
results over a period of time
Ongoing Validation Assessment and
Statistical Analysis, QA
• Repeatability & Comparability
Quality…Control, Assurance, Systems
• Proficiency Testing / Operator Competency
• Trend analysis over time: repeated QC and system
checks to detect robustness, bias, shifts, and
drifts
• New lots checks of primers, probes, etc.
• New reagents, samples, or collection devices
• Outcomes measures: expected vs. actual
• Utilization: expected vs. actual
Examples: New Transport
Intra-assay reproducibility for a known concentration
of nucleic acid, relative efficiency
[Plots: one transport condition is precise; the other is not precise or not efficient]
Validation: Trend Analysis
• The documentation that a verified home-brew
test repeatedly gives the expected results
over a period of time
• Check for biases and changes:
• C.O. or Ct
• Tm
• New lot comparisons
• 6-month validations
[Control chart: Trend Analysis, example positive-control organisms at 100 spores/reaction;
Mean Ct at 100 spores/sample (WSLH) vs. PCR run; Avg = 35.603, UCL = 39.113, LCL = 32.093]
Descriptive statistics and individual measurements of Ct values derived from 100
flow-cytometer-counted spores/sample (flow cytometry performed at WSLH). The CV is
3–6% for intra-assay and 11% for inter-assay variability. (A trend-check sketch follows below.)
AWWARF #2901
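A minimal sketch of the trend check behind a chart like the one above: positive-control Ct values from successive runs are compared against control limits at the mean ± 3 SD. The Ct values, and therefore the limits, below are hypothetical.

```python
# Minimal sketch: Levey-Jennings-style trend check on positive-control Ct values
# (hypothetical data); limits are set at mean +/- 3 SD.
import numpy as np

ct_by_run = np.array([35.2, 36.1, 35.8, 34.9, 36.4, 35.5, 37.0, 35.1,
                      35.9, 36.2, 34.8, 35.6])

mean = ct_by_run.mean()
sd = ct_by_run.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd

for run, ct in enumerate(ct_by_run, start=1):
    flag = "OUT OF CONTROL" if not (lcl <= ct <= ucl) else ""
    print(f"run {run:2d}: Ct = {ct:.1f} {flag}")
print(f"Avg = {mean:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
```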
Summary: Verification/Validation
• CLIA has clear guidelines for validation of home-brew
(user-defined) assays and other non-FDA-cleared
testing
• Test design should mimic reality as much as
possible
• ERROR ASSESSMENT !!!!!!
• Statistical Analysis
• to characterize performance (verification)
• to validate performance
• Define limitations
Questions?
dwolk@email.arizona.edu