PRACTICE EVOLUTION: Decentralized Computer-Assisted IHC Image Analysis

Liron Pantanowitz, MD, FCAP, Director of Pathology Informatics
Richard C. Friedberg, MD, PhD, FCAP, Chairman, Department of Pathology
Baystate Health, Springfield, MA
Tufts University School of Medicine

Why Are We Doing This?
• Practice background
• Today's environment
  - Increased technological innovation
  - Increased biological information
  - Increased clinical demand
• Convergence of two independent long-term trends

Key Trend #1 in the Practice of Anatomic Pathology
• Evolution along Clinical Pathology lines
  - Greater concern with analytical precision, reproducibility, accuracy, specificity & reliability
  - Qualitative becoming quantitative
  - "Stains" becoming "assays"
  - Results directly tied to treatment, not just prognosis
  - Diminishing "guild" mentality with anointed experts
• Examples
  - IHC & ELISA
  - HER-2/neu & Herceptin

Key Trend #2 in the Practice of Anatomic Pathology
• Evolution along Radiology/Imaging lines
  - Analog images establish the field
  - Market & technology forces start the trend toward digital imaging: initially scanning of analog images, later digitally acquired images
  - Digitization of images allows new applications
  - Significant workload & throughput implications
• Examples
  - PACS
  - Convergence imaging
  - Windowing
  - Dynamic images
  - Telediagnostics

Expectations
• Eventually
  - Every "image-based" pathologist will use computer-assisted analytic tools to assay specimens
  - Intelligently designed PACS will revolutionize pathology workflow
  - Increased reliance upon pathology

Breast Cancer & Immunohistochemistry (IHC)
Determining breast tumor markers (ER, PR & HER-2/neu) for prognostic & predictive purposes by IHC &/or FISH is the standard of practice.
IHC scoring/quantification by manual microscopy remains the accepted gold standard.
Surgical Pathology workflow involves:
• Pre-analytic preparation (e.g. tissue fixation & processing)
• Analysis (i.e. staining of control & patient slides)
• Post-analytic component (e.g. quantification & reporting)
Discrepancies between HER2 IHC & FISH mainly reflect errors in manual interpretation, not reagent limitations (Bloom & Harrington. AJCP 2004; 121:620-30).
Inter- & intra-observer differences in scoring occur:
• Most notably with borderline & weakly stained cases
• Related to fatigue & the subjectivity of human observers

Accuracy is Required
Accuracy = the degree to which a measured value agrees with an accepted standard.
Accurate ER, PR & HER-2/neu status in breast cancer is required to ensure appropriate therapeutic intervention.
The lay press has communicated concerns over inaccuracies in breast biomarker testing.
There is a threat of having to refer such testing to reference laboratories.
Is computer-assisted image analysis (CAIA) a better (i.e. more accurate & reproducible) method for scoring IHC?

Guidelines
ASCO/CAP Guideline Recommendations for HER-2/neu testing in breast cancer (Wolff et al. Arch Pathol Lab Med 2007; 131:18)
• Image analysis can be an effective tool for achieving consistent interpretation
• A pathologist must confirm the image analysis result
• Image analysis equipment (including optical microscopes) must be calibrated and subjected to regular maintenance & internal QC evaluation
• Image analysis procedures must be validated (see the sketch below)
Canadian National Consensus Meeting on HER2/neu testing in breast cancer (Hanna et al. Current Oncology 2007; 14:149-53)
• Image analysis systems can be useful to enhance the reproducibility of scoring
• Pathologists must supervise all image analyses
FDA clearance for CAIA in vitro diagnostic use of HER-2/neu, ER & PR IHC has been obtained by several companies
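Both guideline documents call for image analysis procedures to be validated and for pathologists to confirm the results. As a minimal, hypothetical sketch of how such a validation comparison might be summarized, the Python snippet below computes overall percent agreement and Cohen's kappa between manual and computer-assisted HER-2/neu categories; the ten paired scores are made up for illustration and are not study data.

```python
# Hypothetical validation comparison: manual vs. computer-assisted HER2 scores.
# Categories follow the ASCO/CAP 2007 reporting bins (0/1+, 2+, 3+).
from collections import Counter

def percent_agreement(a, b):
    """Fraction of cases scored identically by both methods."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters on categorical scores."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    expected = sum(counts_a[c] * counts_b[c] for c in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative (made-up) scores for ten cases -- not the study data.
manual = ["0/1+", "0/1+", "2+", "3+", "0/1+", "2+", "3+", "0/1+", "2+", "3+"]
caia   = ["0/1+", "0/1+", "0/1+", "3+", "0/1+", "2+", "2+", "0/1+", "2+", "3+"]

print(f"Percent agreement: {percent_agreement(manual, caia):.2f}")
print(f"Cohen's kappa:     {cohens_kappa(manual, caia):.2f}")
```

The same summary applies equally to paired ER/PR categories or to any other categorical readout compared across two observers or systems.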
CAIA vs. Manual Score
Remmele & Schicketanz. Pathol Res Pract 1993; 189:862-6:
"Subjective grading of slides is a simple, rapid and useful method for the determination of tissue receptor content and must not be replaced by expensive and time-consuming computer-assisted image analysis in daily practice."

Data on CAIA & IHC
Early studies showed CAIA was no better than visual analysis (Schultz et al. Anal Quant Cytol Histol 1992; 14:35-40).
A few studies have shown that manual scoring & CAIA are comparable (Diaz et al. Ann Diagn Pathol 2004; 8:23-7).
Most studies found CAIA to be superior to manual methods (Taylor & Levenson. Histopathology 2006; 49:411-24; McClelland et al. Cancer Res 1990; 50:3545-50; Kohlberger et al. Anticancer Res 1999; 19:2189-93; Wang et al. Am J Clin Pathol 2001; 116:495-503; Turner et al. USCAP 2008 abstract 1694):
• Provides effective qualitative & quantitative evaluation
• More consistent than manual & digital microscopy
• More precise (scan per scan) than pathologists
One study showed agreement between different CAIA systems: ChromaVision ACIS & Applied Imaging Ariol SL-50 (Gokhale et al. Appl Immunohistochem Mol Morphol 2007; 15:451-5).

Published Considerations
• Expense of CAIA may be hard to justify where volumes are low
• Image analysis frequently requires interactive input by the pathologist
• Increased time requirements
• Systems may be discrepant when tumor cells have low levels of staining
• Interfering non-specific staining within selected areas
• Images must be free from artifacts
• Small amounts of stained tissue can erroneously generate lower scores

CAIA Systems
• ImageJ (freeware developed by the NIH)
• Adobe Photoshop software (Lehr et al. J Histochem Cytochem 1997; 45:1559-65)
• Automated Cellular Imaging System (ChromaVision)
• Pathiam (BioImagene)
• Applied Imaging Ariol (Genetix)
• Spectrum (Aperio)

Image Analysis & Algorithms
Object-oriented (morphology-based) image analysis:
• Involves color normalization, background extraction, segmentation, classification & feature selection
• Separation of tissue elements (e.g. tumor epithelium) from background (e.g. stroma) permits selection of areas of interest & filtering out of unwanted areas
• The region of interest (ROI) is subject to further image analysis (computation of a diagnostic score)
• Quantification of results

Digital Algorithm
[Images courtesy of BioImagene]

Validation & Implementation at Baystate Health
• Distant medical centers
• Significant breast IHC caseload
• Need to mimic daily practice
  - Avoid central (single-user) image analysis
• Bandwidth limitations
• Whole slide imager availability
• Professional reluctance to read digital images

Key Components
• Multimedia PC upgrade
• SPOT Diagnostic digital cameras for each workstation
• Pathiam (BioImagene) web-based application
• Server (Oracle database + application + image file storage)
• Training & validation

Workflow
[Diagram: control IHC → patient IHC → FOV analysis → report generation; highlights the need for standardization and calibrated workstations]

FOV IHC Analysis
• FFPE breast cases routinely stained for ER, PR & HER-2/neu
• Standardized camera acquisition settings (calibration)
• Pathologists (n=3) acquired 3-5 fields of view (FOVs), each at 20x magnification
• Uniform JPEG image file format used (4 MB)
• Post-processing image manipulation was avoided
• Control parameter set defined per IHC run (default or modified)
• ER/PR nuclear staining analyzed using the Allred scoring system (proportion score + intensity score = total score; see the sketch below)
• HER-2/neu membranous staining evaluated per ASCO/CAP 2007 recommendations (0, 1+, 2+, 3+)
• Manual vs. CAIA comparison tracked (IHC score, time & problems)
• FISH for HER2/neu obtained on several cases
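As a rough illustration of the Allred arithmetic referenced above (not BioImagene's implementation), the hypothetical Python helper below converts a fraction of positive tumor nuclei and an intensity grade, assumed to come from an upstream image-analysis or manual counting step, into the 0-8 total score using the conventional Allred proportion cut points.

```python
# A hypothetical helper illustrating the Allred arithmetic used for ER/PR:
# total score (TS, 0-8) = proportion score (0-5) + intensity score (0-3).
# The fraction of positive tumor nuclei and the intensity grade are assumed
# to come from an upstream step (manual count or image analysis).

def allred_total_score(fraction_positive: float, intensity_score: int) -> int:
    """Return the Allred total score for one ER or PR stained section."""
    if not 0.0 <= fraction_positive <= 1.0 or intensity_score not in (0, 1, 2, 3):
        raise ValueError("fraction must be in [0, 1]; intensity in {0, 1, 2, 3}")
    if fraction_positive == 0:
        proportion_score = 0          # no positive nuclei
    elif fraction_positive < 0.01:
        proportion_score = 1          # < 1%
    elif fraction_positive <= 0.10:
        proportion_score = 2          # 1-10%
    elif fraction_positive <= 0.33:
        proportion_score = 3          # 11-33%
    elif fraction_positive <= 0.66:
        proportion_score = 4          # 34-66%
    else:
        proportion_score = 5          # 67-100%
    return proportion_score + intensity_score

# Example: 40% of tumor nuclei positive with intermediate (2+) intensity -> TS 6.
print(allred_total_score(0.40, 2))
```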
ER/PR Correlation (N=29)

Biomarker   Concordant cases   Discordant cases
ER+         16                 0
ER-         4                  2*
PR+         14                 0
PR-         4                  3*
* 3 cases

HER-2/Neu Results (N=28)

CAIA score    Manual score
              0/1+    2+     3+
0/1+          16      1*     -
2+            -       3      1**
3+            -       -      4
FISH results: * negative (ratio 1.04); ** abnormal (ratio 6.5)

HER-2/Neu FISH Correlation

Manual score   CAIA score   FISH result
0              0            Negative (1.06)
0              0            Negative (0.93)
1              1            Negative (1.04)
1              0            Negative (1.00)
1              0            Negative (1.07)
1              0            Negative (1.66)
2              1            Negative (1.04)
3              2            Abnormal (6.5)

Challenging Cases
• Infiltrating lobular carcinoma
• Cytoplasmic staining

Lessons Learned
• Decentralized CAIA for IHC, designed to mimic daily surgical pathology workflow, is feasible in practice
• Image acquisition requires standardization
• Tissue heterogeneity (whether biological or due to IHC variation) may impact FOV selection
• Pathologists must supervise CAIA systems

Future Prospects
• Adopt workflow-centric virtual systems that are feasible for routine practice (and may potentially yield better results)
  - e.g. whole slide imaging (WSI) to eliminate the need to standardize different systems
• Automatic ROI selection & image analysis
• Shortened analysis time
• AP-LIS & CAIA system integration
  - To improve workflow
  - To permit disparate systems to access the same digital images & case data
• Learning algorithms
  - Systems that improve with experience following pathologist feedback (see the sketch at the end of this document)
• Clinical outcome studies are needed
  - In one study, CAIA for ER IHC yielded results that did not differ from human scoring when judged against patient-outcome gold standards (Turbin et al. Breast Cancer Res Treat 2007)

Acknowledgements
Christopher N. Otis, MD
Giovana M. Crisi, MD
Andrew Ellithorpe, MHS
Peter Marquis, BA
BioImagene

TRANSFORMING PATHOLOGY: Emerging technology driving practice innovation
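A minimal sketch of the "learning algorithm" idea raised under Future Prospects: a classifier updated incrementally whenever a pathologist confirms or corrects a computer-assisted score. It assumes scikit-learn's SGDClassifier and two image-derived features (mean stain intensity, fraction of positive nuclei); the feature choice, class encoding, and data are hypothetical illustrations, not any vendor's method.

```python
# A classifier that is updated each time a pathologist confirms or corrects
# a computer-assisted score. Features and data here are hypothetical.
import numpy as np
from sklearn.linear_model import SGDClassifier

CLASSES = np.array([0, 1, 2, 3])       # HER-2/neu IHC categories 0, 1+, 2+, 3+
model = SGDClassifier(random_state=0)  # supports incremental partial_fit

# Seed the model with a small synthetic batch of previously scored FOVs.
rng = np.random.default_rng(0)
X_seed = rng.random((40, 2))           # columns: [mean intensity, % positive]
y_seed = rng.integers(0, 4, size=40)
model.partial_fit(X_seed, y_seed, classes=CLASSES)

def incorporate_feedback(features, confirmed_score):
    """Fold one pathologist-confirmed score back into the model."""
    model.partial_fit(np.atleast_2d(features), [confirmed_score])

# Each supervised review becomes one more training example.
incorporate_feedback([0.72, 0.55], 3)
print(model.predict([[0.72, 0.55]]))
```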