RADPEER™
Hani Abujudeh, MD, MBA, FSIR
Associate Professor of Radiology
Massachusetts General Hospital
Harvard Medical School

Disclosures
Book royalties

RADPEER™
RADPEER™ is a simple tool developed to allow physicians to perform peer review during the course of a day's work. When a new study is interpreted with a prior study available for comparison, a peer review of the accuracy of the interpretation of the previous examination occurs.

RADPEER™
RADPEER™ piloted in 2001
Offered to members in 2002
eRADPEER™ developed in 2005
Scoring changes implemented in 2009

Signal Events
The American Medical Accreditation Program by the AMA in 1998
Publication of the Institute of Medicine report To Err Is Human in March 2000
An ABMS task force report on the maintenance of certification, Competence Initiatives: A Status Report, listed four components of maintenance of certification that will be required by all specialty boards: professional standing, commitment to lifelong learning and periodic self-assessment, cognitive expertise, and the evaluation of performance in practice.

ACR RADPEER
In response to the interests of the public and the health care community, the ACR convened a patient safety task force. The task force concluded that, to meet the fourth requirement of maintenance of certification, a successful peer review program must be national, uniform in structure and function across practices, accurate, facile, nonpunitive, and able to be integrated into a facility's quality assurance program. The task force also concluded that no existing programs met these criteria.

RADPEER™
Four-point scoring system (a code sketch of this scale appears below):
1. Concur with interpretation
2. Discrepancy in interpretation / not ordinarily expected to be made (understandable miss)
   a. unlikely to be clinically significant
   b. likely to be clinically significant
3. Discrepancy in interpretation / should be made most of the time
   a. unlikely to be clinically significant
   b. likely to be clinically significant
4. Discrepancy in interpretation / should be made almost every time; misinterpretation of finding
   a. unlikely to be clinically significant
   b. likely to be clinically significant
Scores of 2b, 3, or 4 should be reviewed through the facility's internal QA process prior to submission to the ACR.

RADPEER™
The median number of cases reviewed in RADPEER is 776 each year, which translates into 3 to 4 cases reviewed per working day (776 ÷ roughly 250 working days ≈ 3).
The RADPEER system is not designed to be a sole OPPE measure, but it can be incorporated into such programs.
The committee discourages the use of scores as a means of competency assessment and encourages maintaining the nonpunitive nature and anonymity of scoring.
RADPEER could be expanded and used as a means to collect data on other aspects of quality and safety in radiology.
Improving the performance of the system as a whole is far more beneficial than eliminating the outliers (Deming WE. Out of the Crisis. Cambridge, MA: MIT Press; 2000).
(Assessing competence vs. improving performance)
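The four-point scale and its submission rule lend themselves to a simple representation in a practice's own peer-review tooling. The sketch below is a minimal illustration only and assumes nothing about the ACR's software: the names RadpeerScore and needs_internal_qa_review are hypothetical, while the score labels and the rule that 2b, 3, and 4 go through internal QA before submission follow the scoring slide above.

from dataclasses import dataclass
from typing import Optional

# Illustrative only: RadpeerScore and needs_internal_qa_review are hypothetical
# names, not part of any ACR or RADPEER software.
@dataclass(frozen=True)
class RadpeerScore:
    level: int                                     # 1-4 on the four-point scale
    clinically_significant: Optional[bool] = None  # the a/b modifier; None for level 1

    def label(self) -> str:
        """Render the score as reported, e.g. '1', '2a', '3b'."""
        if self.level == 1:
            return "1"
        return f"{self.level}{'b' if self.clinically_significant else 'a'}"

    def needs_internal_qa_review(self) -> bool:
        """Scores of 2b, 3, or 4 are reviewed through the facility's internal
        QA process before submission to the ACR."""
        if self.level >= 3:
            return True
        return self.level == 2 and bool(self.clinically_significant)

if __name__ == "__main__":
    for s in (RadpeerScore(1), RadpeerScore(2, False), RadpeerScore(2, True),
              RadpeerScore(3, False), RadpeerScore(4, True)):
        print(s.label(), "-> QA review required:", s.needs_internal_qa_review())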
RADPEER™ (April 2014)
Over 1,170 participating groups
Over 17,500 physicians

RADPEER™ Growth
[Growth chart: fiscal years FY 2005 through FY 2013 on the horizontal axis; vertical axis 0 to 1,200]

RADPEER™
Data submitted to the ACR via website
Reports for individuals and groups available electronically
ABR MOC reports

RADPEER™ as a PQI Project
Radiologists can select RADPEER™ as one of their projects for Maintenance of Certification with the ABR (American Board of Radiology).
The MOC process includes:
Evidence of professional standing (license)
Lifelong learning and self-assessment (SAMs, CME)
Cognitive expertise (exam)
Practice quality improvement (PQI project such as RADPEER™)

Cost of RADPEER™ Participation
The annual fee for participation in RADPEER™ is based on the number of physicians in the group (a lookup sketch in code appears at the end of this document):

# Physicians    Annual Fee
2-5             $800
6-15            $1,500
16-25           $2,200
26-35           $3,000
36-45           $3,800
46-55           $4,600
56-65           $5,400
66-75           $6,200
76-85           $7,000
86-95           $7,800
96-105          $8,600
106-115         $9,400
116-125         $10,200

https://radpeer.acr.org
habujudeh@partners.org
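As a companion to the fee table above, here is a small lookup sketch. The tier boundaries and dollar amounts mirror the table as presented in this deck; the names FEE_TIERS and annual_fee are hypothetical and used only for illustration.

# Illustrative only: FEE_TIERS and annual_fee are hypothetical names;
# the tiers and amounts copy the fee table above.
FEE_TIERS = [
    (2, 5, 800), (6, 15, 1500), (16, 25, 2200), (26, 35, 3000),
    (36, 45, 3800), (46, 55, 4600), (56, 65, 5400), (66, 75, 6200),
    (76, 85, 7000), (86, 95, 7800), (96, 105, 8600), (106, 115, 9400),
    (116, 125, 10200),
]

def annual_fee(num_physicians: int) -> int:
    """Return the annual RADPEER participation fee for a group of this size."""
    for low, high, fee in FEE_TIERS:
        if low <= num_physicians <= high:
            return fee
    raise ValueError("group size outside the published fee schedule")

print(annual_fee(12))   # $1,500
print(annual_fee(100))  # $8,600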