Development of a CMEQUAL Survey Instrument:
A Standardized Approach to Assessing Physician Expectations and Perceptions of Continuing Medical Education

CONFIDENTIAL DRAFT
Linda L. Casebeer, Ph.D.1
Richard M. Shewchuk, Ph.D.2
Nancy L. Bennett, Ph.D.4
Maziar Abdolrasulnia, Ph.D.1
Todd Jenkins, M.P.H.1
Corresponding Author:
Linda Casebeer, Ph.D.
300 Riverchase Parkway East
Birmingham, Alabama 35244
205-259-1500
linda.casebeer@ceoutcomes.com
1 Outcomes, Inc., Birmingham, Alabama
2 University of Alabama at Birmingham, Birmingham, Alabama
3 Sanofi-Aventis, Bridgewater, New Jersey
4 Consultant, Outcomes, Inc., Birmingham, Alabama
Introduction
The increasingly complex information needs of U.S. physicians result from rapidly expanding
science as well as changes in mandates for maintaining credentials, now required by all U.S.
medical specialty boards.1, 2 These complex information needs place increasing demands on
continuing medical education (CME) activities to provide access to current information that is
relevant to patient care.3-7 While the changing information needs are clear, no standard method
has been developed in North America to assess how well CME activities meet physician
expectations for information critical to patient care. The Accreditation Council for Continuing
Medical Education (ACCME) requires that accredited providers evaluate the effectiveness of
CME activities in meeting identified educational needs, defining compliance as measuring
satisfaction, knowledge, or skills.8 A recent update of these criteria emphasizes the impact of
CME programs on participants’ competence, performance, and patient outcomes.8
In addressing similar challenges related to complex information needs within the context of the
emergence of virtual libraries, the Association of Research Libraries developed a standard
evaluation approach to help libraries assess and improve library services.9 The goal of this
standard evaluation approach, known as LibQUAL, was to help librarians better understand user
perceptions of library service quality, collect and interpret library user feedback over time,
provide comparable assessment information from peer institutions, and identify best practices in
library service.9
LibQUAL was based on the SERVQUAL model and disconfirmation theory, which have been
used to assess satisfaction with the quality of service provided in the service sector.10
Disconfirmation is the difference between perceptions and expectations of service quality.11, 12
Perceived service quality is reflected in the amount and direction of the discrepancy between
customer expectations and perceptions of the provided service.10 SERVQUAL was developed to
measure the gap between expectation and experience on a series of aspects of service quality.13
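
In code terms, the disconfirmation score for an item is simply the perception rating minus the expectation rating. Below is a minimal Python sketch of this scoring; the item names and ratings are illustrative, not data from any actual survey.

```python
# A minimal sketch of disconfirmation scoring; item names and ratings
# are illustrative, not data from any actual survey.

expectations = {"fair and balanced": 6, "clear evidence": 7}   # pre-service
perceptions  = {"fair and balanced": 6, "clear evidence": 5}   # post-service

def disconfirmation(expect, perceive):
    """Gap per item: positive = exceeded expectations, zero = met,
    negative = fell short of expectations (dissatisfaction)."""
    return {item: perceive[item] - expect[item] for item in expect}

print(disconfirmation(expectations, perceptions))
# -> {'fair and balanced': 0, 'clear evidence': -2}
```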
This article describes the development of an instrument to measure quality in CME, based on an
adaptation of the SERVQUAL model, to help CME providers 1) assess user perceptions of the
quality of CME activities in meeting information needs, 2) systematically collect and interpret
feedback over time, 3) provide comparable assessment information across activities and
providers, and 4) identify best practices in CME. The initial steps taken toward these goals are
described below.
Adaptation of SERVQUAL to CME Using the Nominal Group Technique
The disconfirmation theory and the SERVQUAL model, which measure the gap between
customer expectations for quality and the perception of actual service delivered, have been tested
in a number of different settings.12,14-17 However, the application of these models to CME has not
yet been assessed. In another field, the Association of Research Libraries used grounded theory
to determine the constructs important in the adaptation of SERVQUAL to libraries. A series of
interviews with faculty and students supported the design of the survey instrument for quality in
library service, specifically for the research library community.9
To adapt these qualitative methods to CME, an online virtual meeting approach was used to
conduct 4 initial nominal group sessions with 6-8 participants each. The purpose of these
sessions was to elicit the perceptions of CME providers and participants concerning evaluative
information that could be gathered from attendees in order to improve future CME activities.
Although the nominal group technique is a group activity, its structured format minimizes the
limitations of other group processes, such as focus groups (e.g., unbalanced participation and/or
influence across participants), and has been associated with greater satisfaction among
participants.18,19
A group of U.S. community-based practitioners, each in practice for at least 5 years, was invited
by fax to participate in one of two nominal group sessions; 16 practitioners registered to
participate. A sample of 16 nationally accredited CME providers volunteered to participate in the
second set of nominal group sessions. Four panels were convened during January 2006, using a synchronous
Internet-based, virtual meeting room and basic long-distance teleconference calling. Panel
participants accessed the Internet site and were “seated” at a virtual table, where they were able
to see the names of the other participants. Participants were informed that the purpose of the
meeting was to tap into their unique insights, knowledge, and experiences to identify an array of
expectations that could be employed to enhance CME programming.
Participants worked independently for five minutes to develop personal lists of concise
statements/phrases in response to a question posted to their computer monitors: What on-site,
post-activity evaluative information could be gathered from attendees that would make planning
and delivery of CME activities more effective?
Each panelist presented his/her responses to the group. A structured “round-robin” format was
used, in which each participant articulated a single response to the question without providing
any rationale, justification, or explanation. The session facilitator recorded each response on a
virtual flip chart that was posted online to help participants recall previously nominated
responses. The nomination process continued until all responses were exhausted. At the conclusion of the
nomination process, panel participants briefly discussed the identified expectations for the
purpose of clarification - not evaluation - to ensure that every response was understood from a
common perspective. The final phase of the session consisted of a structured prioritization
exercise in which each participant anonymously selected, from the identified list, the three
expectations most important and the three least important for enhancing evaluation at the
conclusion of a CME activity. Panelists were not limited to selecting the expectations that they
themselves had nominated. The results of these panels are summarized in Table 1.
Development of a CMEQUAL Survey Instrument
Roughly patterned after the disconfirmation model used in SERVQUAL and LibQUAL, the term
CMEQUAL represents our adaptation of this paradigm to CME. The statements generated by
the panels provided a series of potential items for a CMEQUAL survey instrument. The
disconfirmation approach adopted for CMEQUAL involves having CME participants evaluate
each item from two perspectives: first, what they expect (the importance of the item), and
second, the extent to which their expectations for the item were met. Readability and usability
were assessed using a cognitive
interviewing process with eight community-based physicians. Cognitive testing of survey
questions or ‘verbal probing’ has become an integral first step in the development of a survey
instrument.20-23 The technique explores how answers are formed, revealing whether respondents
have the information needed to answer a question accurately and whether the answers they give
accurately reflect what they intend to say. By identifying potential problems early and
making appropriate revisions in the CMEQUAL survey, the quality of data can be improved.
Eight phone interviews, each lasting approximately one hour, were conducted. Physicians were
asked to 1) think about the last CME activity they had attended, 2) read the instructions and each
question on the CMEQUAL survey, and 3) respond to each survey item. Each person was asked
to explain how each question was understood and how he or she arrived at the answers given.
Responses to two statements demonstrated the widest discrepancies: 1) “translates trial data in a
way that pertains to the management of my patients” and 2) “provides evidence that will lead to
changes in my practice.”
General comments included “unique, excellent, well done survey,” “very comprehensive,
addresses many concerns that could significantly improve the quality of CME,” and “excellent
idea to evaluate importance before the activity and perceptions after the activity.” All
participants concluded that the survey instrument was too long and should include no more than
10 questions. Instructions were considered clear. Suggestions were made to clarify language in
statements related to “clear evidence in support of presented material,” “targets content at the
appropriate level for me,” and the use of the term “experts” related to speakers and presenters.
The majority suggested deleting the item “monitors through the activity to ensure participant
needs are met,” since the statement was less clear in the context of the survey instrument. That
item was dropped from future drafts. When asked if they would fill out half the form before the
CME activity and half after the activity, respondents said they would fill out the entire survey at
the end unless instructed by a speaker to do so before the activity. The addition of an open-ended
comments box was suggested.
Distillation of CMEQUAL Survey Instrument Items
In order to reduce the number of items included in the survey instrument, another panel of 51
community-based practitioners participated in a card sort exercise to cluster similar items. Panel
participants were mailed a set of 4 x 6 file cards, each printed with one of the 20 survey
statements, and were asked to use rubber bands to band together statements that seemed similar.
Four clusters were created by this panel (Figure 1). Psychometric analysis used multidimensional
scaling to distill the list to 10 items.24
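
As a rough illustration of this step, the sketch below shows how card-sort groupings can be converted into a dissimilarity matrix and embedded with multidimensional scaling. It assumes scikit-learn is available, and the items and sorts are illustrative stand-ins, not the actual 51-practitioner data set.

```python
# A minimal sketch of multidimensional scaling over card-sort data,
# assuming scikit-learn; items and sorts are illustrative stand-ins.
import numpy as np
from sklearn.manifold import MDS

items = ["fair and balanced", "clear evidence", "learn from peers",
         "dialogue with experts", "solve patient cases"]

# Each sort groups the item indices that one practitioner banded together.
sorts = [[{0, 1}, {2, 3}, {4}],
         [{0, 1, 4}, {2, 3}],
         [{0, 1}, {2, 3, 4}]]

# Dissimilarity = fraction of sorters who did NOT place a pair together.
n = len(items)
together = np.zeros((n, n))
for sort in sorts:
    for group in sort:
        for i in group:
            for j in group:
                together[i, j] += 1
dissimilarity = 1.0 - together / len(sorts)

# Embed items in 2-D; nearby points suggest candidate clusters.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)
for label, (x, y) in zip(items, coords):
    print(f"{label:>22}: ({x:+.2f}, {y:+.2f})")
```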
Feasibility and Pilot Testing
The feasibility of using a survey instrument that asked participants to rate expectations before a
CME activity and then to rate perceptions of the activity following participation was tested in a
series of 6 satellite symposia during March 2006 at a large specialty society meeting. The survey
instruments were provided to participants prior to each activity without specific instructions from
the speakers; 341 of the more than 500 physician participants in these CME activities completed
both sections of the form. Following the feasibility testing, the CMEQUAL survey instrument
was pilot tested: the survey instrument was made available to a volunteer group of CME
providers to use in their CME activities between April and December 2006. Demographics of
the 462 responders are summarized in Table 2, and results are summarized in Table 3. Internal
consistency was assessed across these items using Cronbach’s alpha; the resulting coefficient of
.821 falls within the range recommended for survey research.25
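
For readers unfamiliar with the statistic, the following is a minimal sketch of Cronbach’s alpha computed from its standard formula; the simulated ratings are illustrative, not the pilot data.

```python
# A minimal sketch of Cronbach's alpha from its standard formula;
# the simulated ratings below are illustrative, not the pilot data.
import numpy as np

def cronbach_alpha(responses):
    """responses: 2-D array, rows = respondents, columns = items."""
    k = responses.shape[1]                          # number of items
    item_var = responses.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of sum scores
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
shared = rng.integers(2, 7, size=(100, 1))          # common "true" rating
ratings = np.clip(shared + rng.integers(-1, 2, size=(100, 10)), 1, 7)
print(f"alpha = {cronbach_alpha(ratings.astype(float)):.3f}")
```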
Analyzing and Interpreting CMEQUAL Data Through Radar Graphs
The CMEQUAL survey instrument measures the gap between participant expectations for
quality and the perception of actual service delivered. If perceived performance exceeds
expectations, the participant is said to be highly satisfied; if perceived performance equals
expectations, the participant is satisfied; and if perceived performance is less than expectations,
the participant is dissatisfied.12,16 A radar graph of these gaps demonstrates the results of the
initial pilot data from 10 CME activities (Figure 2).
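
A sketch of how such a radar graph can be drawn, assuming matplotlib; the item labels are abbreviated, and the mean gap scores are those reported in Table 3.

```python
# A minimal sketch of the radar graph, assuming matplotlib; labels are
# abbreviated and the mean gap scores are those reported in Table 3.
import numpy as np
import matplotlib.pyplot as plt

items = ["Pressing questions", "Specialty competencies",
         "Fair and balanced", "Clear evidence", "Learn from participants",
         "Dialogue with experts", "Solve patient cases", "Assess learning",
         "Translates trial data", "Addresses barriers"]
gaps = [-0.01, 0.06, 0.05, -0.04, -0.02, 0.18, -0.09, 0.03, -0.12, -0.13]

# Close the polygon by repeating the first point at the end.
angles = np.linspace(0, 2 * np.pi, len(items), endpoint=False).tolist()
angles += angles[:1]
values = gaps + gaps[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, values, marker="o")
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(items, fontsize=7)
ax.set_title("Perceptions minus expectations, 10 CME activities")
plt.tight_layout()
plt.savefig("cmequal_radar.png", dpi=150)
```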
CMEQUAL Survey Items and ACCME 2006 Updated Criteria
During 2006, as the CMEQUAL survey was being developed, the ACCME issued updated
criteria for CME providers. The ACCME proposes rewarding accredited providers for moving
through three levels of accreditation designed to improve the way they provide CME.8 The
CMEQUAL survey items provide information to meet criteria at each of the levels, with
measurements of physician expectations, perceptions, and experiences in their CME activities
(Table 4). Addressing issues around commercial support, the CMEQUAL survey provides items
about balanced content and clear evidence, as well as effective planning, expected of CME
providers at Level 2. Going beyond Level 2, survey items also address the Level 3 focus on
integrating CME into the process for improving professional practice, as well as addressing
barriers to optimal patient management (Level 3; criteria 16, 18-19). Based on the updated
criteria, an additional survey item has been added to address the utilization of non-educational
strategies (e.g., reminders, patient feedback) to enhance change as an adjunct to
activities/educational interventions (Level 3; criterion 17). Overall, the CMEQUAL survey
offers a standardized approach to gathering data that allows providers to assess the degree to
which the provider’s CME mission has been met through the conduct of CME
activities/educational interventions (Levels 1-3; criterion 12). This standardized approach offers
the option of incorporating a core set of questions with additional items more specific to
individual programs.
CMEQUAL Revised for 2007
In reviewing the ACCME updated criteria released in September 2006, an additional item was
generated: “Provides me with supporting materials or tools for my office (reminders, patient
education materials, etc.).” In order to accommodate the additional item and remain within the
suggested 10-item limit, two items related to educational process (“includes opportunities to
learn from other participants” and “includes opportunities to dialogue with national experts”),
which in pilot testing drew lower expectation ratings from physicians than other items, were
combined into one item: “includes interactive opportunities with faculty and participants.” The
final CMEQUAL, 2007, is presented in Appendix A. Acknowledging the potential difficulty for
providers of asking participants to fill out half of the evaluation form prior to the activity, we
have included CMEPERF 2007, which assesses physicians’ perceptions relative to expectations
following the activity (Appendix B). This form of evaluation has also been used within the
context of the SERVQUAL model.26,27
Conclusions
The CMEQUAL project provides an opportunity for CME providers to use a standardized
evaluation tool that supports analysis of the impact of a CME activity relative to physician
expectations, comparison across a variety of activities, and data for evaluating accreditation
criteria. Another benefit of CMEQUAL may be that assessing items related to the ACCME
updated criteria could itself influence physician expectations of CME activities. During 2007,
data collection will continue in larger populations in order to interpret physician user feedback,
provide comparable assessment information, and identify best practices in CME.
Table 1. Evaluation Statements Nominated By Nominal Group Technique Panels
Q01: Targets content at the appropriate level for me
Q02: Enhances my understanding of issues related to this topic
Q03: Addresses competencies identified by my specialty
Q04: Provides reading material prior to the event
Q05: Provides fair and balanced information
Q06: Provides clear evidence in support of presented material
Q07: Presents content that is difficult to access elsewhere
Q08: Provides a print copy of material presented
Q09: Affords an opportunity to engage in dialog with experts in the field
Q10: Provides an opportunity to learn from colleagues
Q11: Allows adequate opportunity for participant interaction
Q12: Monitors throughout the activity to ensure that participant needs are being met
Q13: Employs innovative teaching methods
Q14: Provides an opportunity to assess what I have learned
Q15: Presents material applicable to my clinical practice
Q16: Allows me to develop new skills applicable to my practice
Q17: Uses cases to illustrate application of content
Q18: Translates trial data in a way that pertains to the management of my patients
Q19: Provides evidence that will lead to changes in my practice
Q20: Addresses barriers to optimal patient management
Q21: Reinforces or builds confidence in my current approaches
Table 2. Pilot Testing Participant Demographics

                                                  % (N)
Years in practice:
  < 5                                             19.2% (112)
  5 – 10                                          14.2% (83)
  11 – 20                                         28.3% (165)
  > 20                                            38.3% (223)
Patients seen per week:
  < 5                                             11.2% (54)
  5 – 10                                          8.5% (41)
  11 – 15                                         6.8% (33)
  > 15                                            73.5% (355)
Number of CME credits earned in the past year:
  0 – 15                                          16.3% (75)
  16 – 30                                         44.7% (206)
  31 – 45                                         20.0% (92)
  46 – 60                                         10.4% (48)
  > 60                                            8.7% (40)
Number of CME programs attended annually:
  0 – 5                                           43.8% (202)
  6 – 10                                          25.4% (117)
  11 – 15                                         13.9% (64)
  16 – 20                                         6.5% (30)
  > 20                                            10.4% (48)
Figure 1. Item Cluster Analysis**

[Figure: a two-dimensional map placing the CMEQUAL items into four clusters.]

*Cluster 1: Relevance to patients
†Cluster 2: Educational process
‡Cluster 3: Knowledge and skills
§Cluster 4: Relevance to practice
**Multidimensional scaling (MDS) was used to cluster CMEQUAL items based on similarities
Table 3. Pilot Test Data: Perceptions (met expectations) minus Expectations (importance)

Question                                                   N     Mean (SD)      Respondents below
                                                                                expectations (%)
Addresses my most pressing questions                       492   -0.01 (0.92)   36.7%
Addresses competencies identified by my specialty          491    0.06 (0.86)   32.4%
Provides fair and balanced content                         493    0.05 (0.82)   32.4%
Provides clear evidence to support content                 494   -0.04 (0.76)   34.8%
Includes opportunities to learn from other participants    493   -0.02 (1.05)   37.7%
Includes opportunities to dialogue with national experts   490    0.18 (1.02)   32.1%
Includes opportunities to solve patient cases              486   -0.09 (1.11)   41.1%
Allows me to assess what I have learned                    486    0.03 (0.96)   35.7%
Translates trial data to patients I see in my practice     482   -0.12 (0.91)   42.6%
Addresses barriers to my optimal patient management        479   -0.13 (0.93)   43.1%
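
For clarity, the sketch below shows how one row of Table 3 can be derived from a single item’s paired ratings; the six paired ratings are illustrative, not actual pilot responses.

```python
# A minimal sketch of deriving one Table 3 row from an item's paired
# expectation/perception ratings; the ratings below are illustrative.
import numpy as np

expectation = np.array([7, 6, 5, 7, 6, 4], dtype=float)  # importance, 1-7
perception  = np.array([6, 6, 6, 5, 7, 4], dtype=float)  # met expectations

gap = perception - expectation            # disconfirmation per respondent
mean, sd = gap.mean(), gap.std(ddof=1)    # "Mean (SD)" column
below = (gap < 0).mean() * 100            # "below expectations (%)" column

print(f"N = {gap.size}, mean (SD) = {mean:.2f} ({sd:.2f}), "
      f"below expectations = {below:.1f}%")
```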
Figure 2. Perceptions (met expectations) Minus Expectations (importance) for 10 CME Programs

[Radar graph plotting the mean gap score for each of the 10 survey items in Table 3, with the
percent of participants experiencing negative disconfirmation* shown in parentheses beside each
item label.]

*Percent of CME participants that experienced negative disconfirmation or were dissatisfied
with the programs on each item
Table 4. CMEQUAL Survey Items and ACCME 2006 Updated Criteria

Each CMEQUAL and CMEPERF survey item is listed with the corresponding ACCME updated
criteria (September 2006).

Addresses my most pressing questions
  16. The provider operates in a manner that integrates CME into the process for improving
  professional practice.

Addresses competencies identified by my specialty
  4. The provider generates activities/educational interventions around content that matches the
  learner’s current or potential scope of professional activities.

Provides fair and balanced content
  6. The provider develops activities/educational interventions in the context of desirable
  physician attributes.
  7. The provider develops activities/educational interventions independent of commercial
  interests.

Provides clear evidence to support content
  10. The provider actively promotes improvement in health care and not proprietary interests of
  a commercial supporter.

Includes opportunities to learn from other participants
  5. The provider chooses educational formats for activities/interventions that are appropriate
  for the setting, objectives, and desired results of the activity.

Includes dialogue with national experts
  5. The provider chooses educational formats for activities/interventions that are appropriate
  for the setting, objectives, and desired results of the activity.

Includes opportunities to solve patient cases
  6. The provider develops activities/educational interventions in the context of desirable
  physician attributes.
  16. The provider operates in a manner that integrates CME into the process for improving
  professional practice.

Allows me to assess what I have learned
  11. The provider analyzes changes in learners (competence, performance, or patient outcome)
  achieved as a result of the overall program activities/educational interventions.

Translates trial data to patients I see in my practice
  16. The provider operates in a manner that integrates CME into the process for improving
  professional practice.
  18. The provider identifies factors outside the provider’s control that impact on patient
  outcomes.

Addresses barriers to my optimal patient management
  19. The provider implements educational strategies to remove, overcome, or address barriers to
  physician change.
Acknowledgments
The authors gratefully acknowledge support for this project from Sanofi-Aventis, as well as the
Post Graduate Institute of Medicine for assistance in feasibility testing, Marc Crawford and CME
Outcomes for developing a scannable version of the survey form and gathering pilot data, and
Michael Likos for assistance with data gathering and processing.
References
1. Miller SH. American Board of Medical Specialties and repositioning for excellence in lifelong
learning: maintenance of certification. J Contin Educ Health Prof. 2005;25:151-156.
2. Schrock JW, Cydulka RK. Lifelong learning. Emerg Med Clin North Am. 2006;24:785-3.
3. Bennett NL, Casebeer LL, Zheng S, Kristofco R. Information-seeking behaviors and reflective
practice. J Contin Educ Health Prof. 2006;26(2):120-127.
4. Bennett NL, Casebeer LL, Kristofco RE, Collins BC. Family physicians' information seeking
behaviors: a survey comparison with other specialties. BMC Med Inform Decis Mak. 2005;5(1):9.
5. Godin P, Hubbs R, Woods B, Tsai M, Nag D, Rindfleisch T, Dev P, Melmon KL. A new
instrument for medical decision support and education: the Stanford Health Information Network
for Education. In: Proceedings of the 32nd Annual Hawaii International Conference on System
Sciences (HICSS-32); 1999.
6. Moore DE, Pennington FC. Practice-based learning and improvement. J Contin Educ Health
Prof. 2005;23:S73-S80.
7. Sackett DL, Rosenberg WM. On the need for evidence-based medicine. J Public Health Med.
1995;17:330-334.
8. Accreditation Council for Continuing Medical Education. www.accme.org. Accessed
December 16, 2006.
9. Cook C, Heath F, Thompson B, et al. LibQUAL 2004 Survey. Association of Research
Libraries, Texas A&M University; 2004. www.libqual.org.
10. Zeithaml VA, Parasuraman A, Berry L. Delivering Quality Service: Balancing Customer
Perceptions and Expectations. New York: Free Press; 1990.
11. Churchill GA, Surprenant C. An investigation into the determinants of customer satisfaction.
Journal of Marketing Research. 1982;19:491-504.
12. Oliver RL. A cognitive model of the antecedents and consequences of satisfaction decisions.
Journal of Marketing Research. 1980;17:460-469.
13. Parasuraman A, Zeithaml VA, Berry LL. SERVQUAL: a multiple-item scale for measuring
consumer perceptions of service quality. Journal of Retailing. 1988;64(1):12-40.
14. Fisk RP, Brown SW, Bitner MJ. Tracking the evolution of the services marketing literature.
Journal of Retailing. 1993;60:61-103.
15. Fontenot G, Behara R, Gresham A. Six sigma in customer satisfaction. Quality Progress.
1994;27:73-76.
16. Oliver RL. Measurement and evaluation of satisfaction processes in retail settings. Journal of
Retailing. 1981;57:25-48.
17. Parasuraman A, Berry LL, Zeithaml VA. Alternative scales for measuring service quality: a
comparative assessment based on psychometric and diagnostic criteria. Journal of Retailing.
1994;70:201-230.
18. Delbecq AL, Van de Ven AH, Gustafson DH. Group Techniques for Program Planning: A
Guide to Nominal Group and Delphi Processes. Chicago: Scott Foresman; 1975.
19. Kristofco R, Shewchuk R, Casebeer L, Bellande B, Bennett N. Attributes of an ideal
continuing medical education institution identified through nominal group technique. J Contin
Educ Health Prof. 2005;25(3):221-228.
20. Collier D, Brady HE, eds. Rethinking Social Inquiry: Diverse Tools, Shared Standards.
Oxford, UK: Rowman & Littlefield; 2004.
21. Willis GB, DeMaio T, Harris-Kojetin B. Is the bandwagon headed to the methodological
promised land? Evaluating the validity of cognitive interviewing techniques. In: Sirken MG,
Herrmann D, Schechter S, Schwarz N, Tanur J, Tourangeau R, eds. Cognition in Survey
Research. New York: John Wiley; 1999:133-154.
22. Willis GB. Cognitive Interviewing. Thousand Oaks, CA: Sage Publications; 2005.
23. Presser S, Couper MP, Lessler JT, Martin E, Martin J, Rothgeb JM, Singer E. Methods for
testing and evaluating survey questions. Public Opinion Quarterly. 2004;68:109-130.
24. Kruskal JB, Wish M. Multidimensional Scaling. Beverly Hills, CA: Sage Publications; 1978.
25. Nunnally JC. Psychometric Theory. 2nd ed. New York: McGraw-Hill; 1978.
26. Cronin JJ, Taylor SA. SERVQUAL versus SERVPERF: reconciling performance-based and
perceptions-minus-expectations measurement of service quality. Journal of Marketing.
1994;58(1):125-131.
27. Cronin JJ, Taylor SA. Measuring service quality: a reexamination and extension. Journal of
Marketing. 1992;56(3):55-68.
Appendix A: CMEQUAL

Please complete the following evaluation of: <insert program name>
Date: <insert program date>

What is your degree? _____
Number of years in practice? _____
Number of physicians in your practice? _____
Number of patients you see each week? _____
Approximate % of patients you manage for the disease/s addressed by this activity? _____
Number of CME credits earned last year? _____
Number of CME programs attended annually? _____
Zip code of your office? _____
Is your practice principally patient care?  Yes  No
Type of practice:  Community/private   Academic   Hospital   HMO   Other
What is your specialty focus?  Medical specialty   Primary care specialty
 Surgical specialty   Other

The purpose of this evaluation is to receive your feedback in order to improve future educational
activities. All responses are considered confidential. Please complete the questions in Column 1
BEFORE this CME activity begins; after the CME activity, complete Column 2.

Column 1: BEFORE this CME activity begins, please think about what you expect from this
CME activity. Indicate how important each of the following is to you by circling a number from
1 to 7 (Importance to Me in this CME Activity: 1 = Low, 7 = High).

Column 2: AFTER participating in this CME activity, think about your perceptions of the CME
activity; indicate how well each statement characterizes your perceptions by circling a number
from 1 to 7 (This CME Activity Met My Expectations: 1 = Minimally, 7 = Completely).

Each of the following statements is rated 1 2 3 4 5 6 7 in both columns:

Addresses my most pressing questions
Addresses competencies identified by my specialty
Provides fair and balanced content
Provides clear evidence to support content
Includes opportunities to learn interactively from faculty and participants
Provides me with supporting materials or tools for my office (reminders, patient materials, etc.)
Includes opportunities to solve patient cases
Allows me to assess what I have learned
Translates trial data to patients I see in my practice
Addresses barriers to my optimal patient management

Compared to all other CME activities I have participated in over the past year, I would rate this
program as (1 = Needs serious improvement, 4 = Average, 7 = A model of its kind):
1 2 3 4 5 6 7

Do you expect your management strategies in this clinical area to change within the next six
months (1 = No change, 4 = Possible change, 7 = Definite change)?
1 2 3 4 5 6 7

What percent of the content of this activity was new to you? ______ %

Major strengths and weaknesses of the program were:

Thank you for your feedback.

CMEQUAL, 2007, was developed collaboratively by investigators from Outcomes, Inc. and Sanofi-Aventis.
Appendix B. CMEPERF

Please complete the following evaluation of: <insert program name>
Date: <insert program date>

What is your degree? _____
Number of years in practice? _____
Number of physicians in your practice? _____
Number of patients you see each week? _____
Approximate % of patients you manage for the disease/s addressed by this activity? _____
Number of CME credits earned last year? _____
Number of CME programs attended annually? _____
Zip code of your office? _____
Is your practice principally patient care?  Yes  No
Type of practice:  Community/private   Academic   Hospital   HMO   Other
What is your specialty focus?  Medical specialty   Primary care specialty
 Surgical specialty   Other

The purpose of this evaluation is to receive your feedback in order to improve future educational
activities. All responses are considered confidential. Please indicate how well each statement met
your expectations of this CME activity (This CME Activity Met My Expectations:
1 = Minimally, 7 = Completely).

Each of the following statements is rated 1 2 3 4 5 6 7:

Addresses my most pressing questions
Addresses competencies identified by my specialty
Provides fair and balanced content
Provides clear evidence to support content
Includes opportunities to learn interactively from faculty and participants
Provides me with supporting materials or tools for my office (reminders, patient materials, etc.)
Includes opportunities to solve patient cases
Allows me to assess what I have learned
Translates trial data to patients I see in my practice
Addresses barriers to my optimal patient management

Compared to all other CME activities I have participated in over the past year, I would rate this
program as (1 = Needs serious improvement, 4 = Average, 7 = A model of its kind):
1 2 3 4 5 6 7

Do you expect your management strategies in this clinical area to change within the next six
months (1 = No change, 4 = Possible change, 7 = Definite change)?
1 2 3 4 5 6 7

What percent of the content of this activity was new to you? ______ %

Major strengths and weaknesses of the program were:

Thank you for your feedback.

CMEPERF, 2007, was developed collaboratively by investigators from Outcomes, Inc. and Sanofi-Aventis.