
Basic life support skills training in a first year medical curriculum: six years' experience with two cognitive–constructivist designs

Medical Teacher
ISSN: 0142-159X (Print) 1466-187X (Online) Journal homepage: https://www.tandfonline.com/loi/imte20
Halil İbrahim Durak (MD, PhD, MHPE), Agah Çertuğ, Ayhan Çalişkan & Jan van Dalen
To cite this article: Halil İbrahim Durak, Agah Çertuğ, Ayhan Çalişkan & Jan van Dalen (2006) Basic life support skills training in a first year medical curriculum: six years' experience with two cognitive–constructivist designs, Medical Teacher, 28:2, e49–e58.
To link to this article: https://doi.org/10.1080/01421590600617657
Published online: 03 Jul 2009.
Ege University Faculty of Medicine, Turkey; University of Maastricht, The Netherlands
Rationale: Although the Basic Life Support (BLS)
ability of a medical student is a crucial competence, poor BLS
training programs have been documented worldwide. Better
training designs are needed. This study aims to share detailed
descriptions and the test results of two cognitive–constructivist
training models for BLS skills in the first year of medical education.
Method: A BLS skills training module was implemented in
the first year curriculum in the course of 6 years (1997–2003).
The content was derived from the European Resuscitation Council
Guidelines. Initially, a competence-based model was used and was
upgraded to a cognitive apprenticeship model in 2000. The main
performance-content type that was expected at the end of the
course was: competent application of BLS procedures on manikins
and peers at an OSCE as well as 60% achievement in a test
consisting of 25 MCQ items. A retrospective cohort survey
design, using exam results and a self-completed anonymous
student ratings questionnaire, was used to test the models.
Results: Training time for individual students varied from 21
to 29 hours. One thousand seven hundred and sixty students were
trained. Fail rates were very low (1.0–2.2%). The students were
highly satisfied with the module during the 6 years.
Conclusion: In the first year of the medical curriculum,
with competence-based and cognitive apprenticeship models using
cognitive–constructivist designs of skills training (9 hours of
theory and 12–20 hours of practical sessions in groups of
12–17 students), medical students reached a degree of
competence sufficient to perform BLS skills on manikins and
their peers. The cognitive–constructivist designs for skills training
are associated with high student satisfaction. However, the lack
of controls limits the extrapolation of this conclusion.
The term Basic Life Support (BLS) refers to maintaining
an airway and supporting breathing and the circulation.
It comprises the following elements: initial assessment of
the person, airway maintenance, expired air ventilation
(rescue breathing; mouth-to-mouth ventilation) and chest
compression. When all combined, the term cardiopulmonary
resuscitation (CPR) is used (Handley et al., 2001). According
to European Resuscitation Council Guidelines almost 60
different skills are required to perform BLS (Handley et al.,
2001; Phillips et al., 2001). Although effective BLS decreases
morbidity and mortality, and is a core skill for all healthcare
professionals, it has been documented by several authors
worldwide that training programs in this area are poor (Kaye
& Mancini, 1998; Garcia-Barbero & Caturla-Such, 1999;
Perkins et al., 1999; Jordan & Bradley, 2000; Phillips &
Nolan, 2001). Different schools vary greatly in training time
and in the programming of theory and practice. All programs
taught one- and two-rescuer adult BLS. Most included the
management of an obstructed adult airway. The majority also
covered adult BLS with spinal injury. Courses concentrate
training on practical demonstration and practice of BLS.
Assessment of student competence in BLS was carried out in
most courses, usually after initial training. Although BLS
guidelines are disseminated widely, there is no standard for
the program and there is still great confusion about what the
exact content and educational approach should be (Kaye &
Mancini, 1998; Garcia-Barbero & Caturla-Such, 1999;
Jordan & Bradley, 2000; Phillips & Nolan, 2001). More
standardization seems to be called for. Taking current
educational principles into account, carefully designed BLS
skills training programs are needed for health professionals.

Practice points

. Better training designs are needed for BLS in medical
curricula.
. Cognitive–constructivist training designs are promising
in skills training and associated with high student
satisfaction.
. With a competence-based, cognitive apprenticeship
model, a training program including 9 hours of
theoretical and 12 hours of practical small-group sessions
is sufficient for BLS skills.
Correspondence: Halil İbrahim Durak, MD, PhD, MHPE, Ege Üniversitesi Tıp Fakültesi, Turkey.
Tel: +90 232 343 67 70; email: [email protected]
This manuscript was presented as a Master's thesis for the University of
Maastricht MHPE program.
ISSN 0142-159X print/ISSN 1466-187X online/06/020049-10 © 2006 Taylor & Francis
DOI: 10.1080/01421590600617657
Educational perspectives of skills training
In order to design and test an educational program that places
'how people learn' at the center, it is essential to use sound
educational principles and to consider the evidence on their
implementation under different conditions. In this process, the
content and objectives of teaching, the prior knowledge of
learners, the teaching and learning environment and methods,
the assessment of learners' achievements and the program
evaluation are the main issues. In terms of skills
training, there is a growing trend to introduce clinical skills
earlier in the curriculum (Lam et al., 2002). Numerous
reports cite examples of cognitive–constructivist curricular
reform that include clinical skills training and the creation of
clinical skills centers (Dent, 2001), use of simulation
(Issenberg et al., 1999; Issenberg, 2002) and the development
of outcome measures, such as the OSCE, that more accurately
and reliably assess clinical competence (Whelan, 2000).
Moreover, recent experiences in design of skills training focus
on cognitive–constructivist views of learning and emphasize
the adult learning principles, competence-based instruction,
expert modeling and situated learning (Hamo, 1994;
Stillman et al., 1997; Wilson & Jennet, 1997; Boulay &
Medway, 1999; Issenberg, 2002; Patrick, 2002; Rolfe &
Sanson-Fisher, 2002).
According to the cognitive–constructivist view, skills
development consists of three definable stages: (1) initial
skill acquisition; (2) skill compilation (proceduralization);
and (3) skill automaticity (Ford & Kraiger, 1995; Patrick,
2002a). Through practice, feedback, reflection and experience, declarative knowledge is compiled and proceduralized.
Sequencing the content, practice, whole and part drill,
carefully monitoring students’ performance, preventing or
correcting misconceptions, constructive feedback, formative
performance assessment are emphasized in the training
process (Ford & Kraiger, 1995; Patrick, 2002b). The use of
skills guidelines and/or checklists has been shown to be
beneficial in exposing medical students to practical procedures (Hunskaar & Seim, 1985; Bruce, 1989; Hamo, 1994;
Boulay & Medway, 1999).
Competency-based education, which has become dominant at
most stages of medical education (Leung, 2002), consists
of a functional analysis of professions' occupational roles,
the translation of these roles into outcomes, and the assessment
of learners' progress as demonstrated in whole-task
performance of these outcomes. Its potential advantages
include individualized flexible training, transparent standards
and increased public accountability (Leung, 2002).
The Cognitive Apprenticeship Model is based on a
practical educational approach. It reflects a situated perspective by seeking to contextualize learning (Brown et al., 1989).
It has been proposed as an instructional method for
imparting expert models to novices and includes six subprocesses to be undertaken by the trainer and students,
namely: modeling, coaching, scaffolding, articulation, reflection and exploration (Ford & Kraiger, 1995). The first three
sub-processes are under the control of the trainer and last
three are performed by the students.
Assessment validates intended learning outcomes, provides
a unique opportunity for program evaluation, and
should be congruent with sound educational principles.
If the educational principles of the curriculum are not
reflected in and reinforced by the assessment program, the
‘hidden curriculum’ of assessment objectives will prevail
(Harden & Gleeson, 1979; Hafferty, 1998). The Objective
Structured Clinical Examination (OSCE) format emphasizes
the assessment of competence in a structured fashion and
widely impacted the assessment program of many medical
schools during the last 30 years (Harden & Gleeson, 1979;
Mennin & Kalishman, 1998; Fowell & Bligh, 2001).
Student ratings are one of the data sources with the
potential to address the evaluation of educational interventions, namely: materials, staff, effectiveness and outcomes of
the educational process (Marsh, 1984; McKeachie, 1996;
Greenwald, 1997). In our BLS training design we tried to use
almost all the educational principles given above.
Study aims and research questions
Although BLS is a well-known procedure and one of the major
skills for health care professionals, there is an evident scarcity
of adequately designed BLS training programs.
This prompted us to retrospectively analyse our experience.
We aimed to share our 6 years' experience with a cognitive–constructivist
BLS Skills Training Program while focusing on two main
aspects: (1) a description of the training program's context,
educational approach and instructional methods; and
(2) an evaluation of students' achievements and satisfaction.
In order to test our design we analysed three research questions:

(1) Did students develop sufficient knowledge and skills
to competently perform BLS skills within the provided
instruction over the 6 years?
(2) To what level were the students satisfied with learning
BLS skills within this context and educational approach?
(3) Have the students' competencies and level of
satisfaction changed over the years?
In 1997, the Medical Education Committee (UMEC) of
Ege University, Izmir,
Turkey, recognized the need for
competence-based Basic Life Support (BLS) skills training
in early years of the medical curriculum and delegated the
training design and implementation tasks to a multiprofessional group from the Department of Anesthesiology
and the Medical Education Unit (since 1999, the Department
of Medical Education; DME). Before this initiative the BLS
and Advanced Trauma Life Support (ATLS) courses were
taught only in the 5th-year Anesthesiology clerkship. The BLS
skills training module has been in action for 6 years (1997–
2003) in the first year of the curriculum, while BLS reinforcement
and the ATLS course still take place in the 5th year.
Organization and structure of the program
At the beginning, the planning group adopted a modular,
competence-based design including lectures, video demonstration, and skills tutorials (stage 1). The module’s
Table 1. Basic properties of implemented BLS skills training modules.

Educational year                  1997–98  1998–99  1999–2000  2000–01  2001–02  2002–03
Number of students                   136      372       301       293      258      400
Number of rotated groups              –        –         –         –        –        –
Number of skill tutorial groups       –        –         –         –        –        –
Students per trainer                  –        –         –         –        –        –
Practical hours per student           20       14        14        12       12       12
Theoretical hours per student          9        9         9         9        9        9
Total training hours per student      29       23        23        21       21       21
Fail rates (%)                       2.2      1.1       1.0       1.4      1.2      1.0

Notes: Four tutors facilitated the training sessions. Each rotation group was divided
into eight small groups; sessions were divided into two parts and two small groups
rotated in these sessions. Each rotation group was divided into four small groups.
pedagogical background was upgraded to another cognitive–
constructivist design (the cognitive apprenticeship model) in
2000 (stage 2). MCQ tests and checklist-based performance
assessment on manikins were used to assess students'
achievements. Students' evaluation of training questionnaires
(BLS-SEQ) were developed and used in each stage.
We initially trained the students in a temporary skills
laboratory; in 2000 a permanent skills laboratory with far
better conditions was completed and we moved there.
The study guide (which includes a yearly schedule, a translated
version of the ERC BLS guidelines and skills training
checklists) was developed and upgraded yearly. More than
16 skills tutors were involved in the process. Table 1
summarizes the main properties of the modules in the
related years.
Content and objectives
Content was determined from the ERC guidelines. Initial
assessment of the case, airway management, the recovery
position, and adult and pediatric CPR are the main competence areas.
Our objectives were that, at the end of training, students:

. must have in-depth knowledge and understanding of
CPR algorithms as a basis for the procedural skills of BLS;
. must be able to assess the situation appropriately
(initial assessment and field diagnosis) and apply the
correct initial procedures with sufficient speed, considering
the emergency case (safety, consciousness check,
call for help);
. must be able to correctly manage the airway (including
mouth control, neck position, and removal of a foreign
body from the upper respiratory tract) or place the
patient in the recovery position;
. must be able to apply rescue breathing and ventilation
timely, sufficiently and in a procedural manner;
. must be able to correctly assess vital signs and
appropriately apply the pulse-checking procedure;
. must be able to apply a well-dosed cardiac massage
in terms of correct place, appropriate rhythm and
pressure in time; and
. must be able to accurately use an automated external
defibrillator (added in 2000) on infant, child and adult
manikins and on peers (recovery position).
Instructional methods and their implementation
All the skills tutorials took place in the skills laboratory with
Laerdal's Resusci Anne™ and CPR trainer manikins, and with
the students themselves for the recovery position.
Figure 1 illustrates the module’s educational approaches
employed by the years.
. Stage 1 (1997–1999)
Students were trained in six skills tutorials of 3 hours each.
Before the skills tutorials, two introductory lectures on
general BLS concepts and principles and one video session
were provided. Skills tutorials were: (1) initial assessment and
airway management; (2) foreign body removal and recovery
position; (3) one rescuer adult CPR; (4) two rescuer adult
CPR; (5) pediatric CPR; and (6) self-study. In the first five
tutorials, after reviewing the skills checklists, we clearly
demonstrated the skills and allowed students to practice. We
provided feedback and encouraged their peers to give
feedback. Students practiced each procedure at least three
times in the tutorials and received extensive feedback.
In the next year, the available tutors could not sustain the
20 hours of practice with 372 students, and practical time was
decreased to 14 hours (six 2-hour skills tutorials and 2 hours of self-study).
The module was repeated similarly in 1999.
. Stage 2 (2000–2003)
The time had come to add the Automated External
Defibrillator and to revise the content under the guidance of
the ERC 2000 guidelines. The ERC guidelines sequenced the
BLS task into eight meaningful procedures. Based on these
sequences and meaningful procedures, while upgrading our
design to the cognitive apprenticeship model, we prepared four
main skills tutorials for the adult cases as the core of the
module and assumed that students could transfer their
procedural knowledge and skills to pediatric
Figure 1.
The BLS Skills Training module’s educational approaches.
cases. Therefore, we set up only one skills tutorial for
pediatric cases, after the three adult CPR tutorials. The BLS
algorithm-based skill clusters are presented in Figure 2.
Because many students complained about the quality of
the information given via a documentary film, we removed
it and instead added one scenario-based reinforcement and
repetition session. Students were encouraged to reflect their
procedural knowledge on various case scenarios, and to
articulate and discuss it with their peers. The activities are presented in
Table 2.
The components of the cognitive apprenticeship model
used in stage 2 and their counterparts are presented in
Table 3.
Tutors and tutor training. The tutors were anesthesiologists
and trained general practitioners from DME. All tutors were
regular users of BLS and Advanced Trauma Life Support
procedures in their work. The Department of Anesthesiology
(DA) had content expertise and DME had general pedagogical knowledge. Neither department had a sufficient number
of qualified staff members to train large student groups with
the new instructional methods. Through collaboration between
the two departments, starting from the first year, we systematically
trained each other. In this tutor training process, we
developed content specific pedagogical knowledge as well as
BLS tutoring skills.
Subjects. One thousand seven hundred and sixty first-year
students of EUFM who were trained in BLS skills between
1997 and 2003 participated in the study. The comparative analysis of
students’ ratings and exam performances in terms of years
and types of instructional design was used to answer the
research questions.
Procedure. A case study format was used for description
of the module. The number of students, learning
environment, contents, main properties of the educational
approaches employed, instructional formats, durations
and student pass–fail rates were described for each year.
Modifications of the module’s elements were elaborated.
To address the research questions, a retrospective
cohort survey design, by means of self-completed students'
evaluation questionnaires and students' assessment scores,
was used.
. Students’ assessment scores
The main performance-content type that was expected at
the end of the training was the competent application of BLS
procedures in various conditions on the manikins and peers
at the OSCE station and 60% achievement in a test of
25 MCQs.
To increase the precision of our test results, we
developed an evaluation checklist. Pilot testing was done by
observing at least two tutors. After corrections, we trained
the observers. We assessed both the Basic Life Support
procedural knowledge (MCQ) and skills (OSCE). In each
OSCE, at least two stations were developed: one for
pediatric and one for adult life support skills. While one
station used a conscious-case story, the other used an
unconscious case scenario. Within this approach, we tried
to overcome the content specificity problem of testing
medical competence and thus increase content validity.
The number of participating students was sufficient to
allow conclusions at group-level. Each station consisted of
one BLS task which required more than fifteen skills.
In order to increase fidelity, we used peers as manikins
in the conscious adult case scenario (for recovery positioning skills).
Figure 2. Basic Life Support skills and their clustered skills tutorials in the module.
Table 2. The types and sessions of instructional activities of the module, by stage.

. Introductory lessons
. Video film: documentary film about BLS (stage 1)
. Chain of survival and BLS algorithms: small group discussion
. Checking consciousness, airway management, foreign body removal and recovery position: skills tutorial
. One-rescuer CPR procedures for adults: skills tutorial
. Two-rescuer CPR procedures for adults: skills tutorial
. Pediatric CPR and airway management: skills tutorial
. Automatic external defibrillation and general repetition of CPR skills: skills tutorial (stage 2)
. Scenario-based reinforcement and repetition: small group discussion and practice (stage 2)
. Self-study drill: study with peers
We set our standards by an Angoff procedure as a
test-centered method (Wilkinson et al., 2001). At least
four tutors took part in each Angoff procedure for the
OSCE stations and in a modified Angoff procedure for the
MCQ tests, and we defined the minimum pass level (MPL).
Students who achieved just the pass level received 60 points;
because of the strict University Exam Legislation, this
recoding procedure was applied. A sample OSCE station and
assessment MCQ questions are presented in Appendices 6.1 and 6.2.
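The Angoff standard setting and pass-mark recoding described above can be sketched as follows. This is a minimal illustration, not the module's actual procedure or data: the function names and the judges' ratings are ours, and we assume a linear recoding that maps the MPL to the pass mark of 60 and the maximum raw score to 100.

```python
# Hedged sketch of an Angoff procedure and pass-mark recoding.
# All names and numbers are illustrative, not the module's actual data.

def angoff_mpl(ratings_by_judge):
    """MPL = mean over judges of the summed per-item ratings."""
    sums = [sum(r) for r in ratings_by_judge]
    return sum(sums) / len(sums)

def recode(raw, mpl, max_score, pass_mark=60.0, top=100.0):
    """Linearly rescale so a raw score equal to the MPL maps to the pass mark."""
    if raw <= mpl:
        return pass_mark * raw / mpl
    return pass_mark + (top - pass_mark) * (raw - mpl) / (max_score - mpl)

# Four judges estimate, per checklist item, the probability that a
# borderline student performs the item correctly (4-item station):
judges = [
    [0.6, 0.7, 0.5, 0.8],
    [0.5, 0.6, 0.6, 0.7],
    [0.7, 0.8, 0.6, 0.9],
    [0.6, 0.7, 0.7, 0.8],
]
mpl = angoff_mpl(judges)                      # about 2.7 on a 0-4 raw scale
borderline = recode(2.7, mpl, max_score=4.0)  # lands at roughly the pass mark
```

In this sketch a student scoring exactly at the MPL receives 60 points, consistent with the recoding the text describes.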
. Students’ Evaluation Questionnaire (SEQ)
For the first 3 years we used SEQa. Students were asked to
rate their agreement with seven statements on a five-point
Table 3. Components of the Cognitive Apprenticeship Model and their counterparts in the BLS module.

1. Modeling: explanations; correct and fluid demonstration.
2. Coaching: observing student performance, providing feedback, providing corrective instruction.
3. Scaffolding: suggesting, hints, physical props.
4. Articulation: summarizing, describing, reasoning, etc.
5. Reflection: evaluating one's own learning process.
6. Exploration: pursuit of new goals, testing hypotheses or assumptions.

Associated learning processes: goal setting; knowledge acquisition; elaboration, networking; accomplishment of the task (motivation); learning as a constructive process (constructive acting); expert model; self-directed learning.

Module's training activities: introductory lectures; video session; algorithm tutorial; demonstration in skills tutorials; articulation and practice in the skills tutorials; scenario session; peer evaluation; self-study; reader and training checklists; the skills tutorials' process.

Adapted from Ford (1995).
Likert scale, ranging from 1 (completely disagree) to 5
(completely agree). We asked students to answer anonymously
'how they found the module', addressing the physical
setting, module duration, resources and materials, the skills
tutors, and the opportunity to be actively involved in the sessions.
While upgrading the module to the cognitive
apprenticeship model in 2000, we added a few questions about
the new model while maintaining the original questions
of the SEQa. SEQb consisted of three distinctive subsets.
It consisted of six ‘input’, four ‘process’ and two ‘output’
items, in total 12 statements. The first seven items were
exactly the same as the SEQa.
Congruent with the new educational approach, we added
five items which aimed to evaluate students’ perceptions of
the degree of their learning motivation, relevancy of the
content, information load, tutors’ guidance in the learning
process and their enjoyment while learning. Both instruments
included an open-ended question asking for students' written
comments, to collect other individual perceptions and
elaborations of their ratings.
In developing both SEQa and SEQb, we used an item
pooling approach. All the tutors put some items into the pool
where they believed the item belonged. Later we reviewed
the items and selected the most important concepts to be
evaluated by students. The second step was a meeting with
the students. We asked their opinion and remarks on clarity
and understandability of the questionnaire and revised
the wording. The last step was piloting: we asked twenty
students who were in the first year but had not attended the
module. According to the pilot results, the final corrections
were made at a tutors' meeting.
Questionnaires were administered at the end of exams.
Response rates were 81.1% for SEQa and 88.4% for SEQb.
The total response rate was 85.1%. Cronbach's alpha coefficients
were found to be 0.78 for SEQa and 0.83 for SEQb.
Although the reliability coefficient for SEQa did not reach
the 0.80 standard (Bryman & Cramer, 2002),
the written comments provided deeper elaboration.
The students' written comments were also analyzed.
The standard was absolute for the Likert-type items.
The percentages of low (<3) and high (>3) rating scores and
basic statistics of the items (means, percentiles, standard
deviations) were used to define strong and weak aspects.
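The reliability and rating summaries described above can be sketched as follows. This is a minimal implementation from the standard formulas, not the study's analysis code, and the example data are illustrative rather than actual SEQ responses:

```python
# Hedged sketch: Cronbach's alpha and low/high Likert rating shares.
# Example data are illustrative, not the actual SEQ responses.

def _variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, same respondent order."""
    k = len(items)
    n = len(items[0])
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(_variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / _variance(totals))

def low_high_share(ratings, midpoint=3):
    """Percentages of ratings below and above the Likert midpoint."""
    n = len(ratings)
    low = 100.0 * sum(1 for r in ratings if r < midpoint) / n
    high = 100.0 * sum(1 for r in ratings if r > midpoint) / n
    return low, high
```

With two perfectly parallel items, for instance, `cronbach_alpha([[1, 2, 3], [1, 2, 3]])` evaluates to 1.0, while uncorrelated items drive the coefficient toward zero.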
The clients of the evaluations were the students
(as information providers and customers), module planning
group and the tutors (as information receivers for valuing,
reflecting toward improvement of the module) and dean’s
office (as curriculum administrator). We interpreted the
assessment and SEQ results in the module planning group
to form an improvement plan through the years. Instructors and
tutors met several times a year, shared their views
and collaboratively interpreted the students' ratings.
Instructional changes were made based on these meetings
and on the number of students in a given year.
SPSS for Windows Statistic Package (version 11.5)
was used for data analysis.
Students’ performances on assessments
Fail rates were very low (1.0–2.2%). In the MCQ exams,
almost all the students achieved high marks (80.5 ± 12.1) and
performed well on the manikins and their peers at the OSCE
stations (81.7 ± 12.5). The views of the tutors support these
results. In terms of students' scores (MCQ and OSCE), there
was no difference between the years. A significant correlation
Table 4. Means and standard deviations (in brackets) of the students' scores in the years.

Assessment   1997–98      1998–99      1999–2000    2000–01      2001–02      2002–03
procedure    (n = 136)    (n = 372)    (n = 301)    (n = 293)    (n = 258)    (n = 400)
MCQ          77.3 (13.5)  79.5 (12.5)  81.3 (11.5)  80.9 (11.6)  81.0 (11.2)  81.3 (12.2)
OSCE         81.3 (13.0)  81.9 (12.8)  82.0 (11.8)  81.5 (12.2)  81.6 (12.4)  81.8 (13.0)
was found between the mean scores of the MCQ and the OSCE
(Pearson r = 0.281, p < 0.001) in all the years.
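The overall means quoted above (80.5 for the MCQ, 81.7 for the OSCE) are consistent with the yearly figures in Table 4 when those are weighted by cohort size. A minimal consistency check, assuming the Table 4 values pair up as yearly MCQ and OSCE means for the six cohorts:

```python
# Weighted-average check of the overall MCQ and OSCE means against Table 4.
cohorts = [136, 372, 301, 293, 258, 400]        # students per educational year
mcq = [77.3, 79.5, 81.3, 80.9, 81.0, 81.3]      # yearly MCQ means from Table 4
osce = [81.3, 81.9, 82.0, 81.5, 81.6, 81.8]     # yearly OSCE means from Table 4

def weighted_mean(means, weights):
    """Cohort-size-weighted average of the yearly means."""
    return sum(m * w for m, w in zip(means, weights)) / sum(weights)

overall_mcq = round(weighted_mean(mcq, cohorts), 1)    # 80.5
overall_osce = round(weighted_mean(osce, cohorts), 1)  # 81.7
```

The cohort sizes sum to the 1760 trained students reported in the abstract, and the weighted means reproduce the 80.5 and 81.7 reported in the text.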
Students' evaluation results

Table 5 shows the yearly survey results. Average ratings of the
input variables varied between 3.5 and 4.5, indicating a
positive impression, which we also see in the process variables
(4.1–4.5). As a control item, we asked about information
overload in a negative formulation (item 9). Students indicated
that they were not overloaded with information (2.4–3.0).

Adults learn if they are motivated and if the training gives
them a perspective on the relevance of the skills to be learned
to their profession or future job (Peyton, 1998). Reflecting
these two main adult learning principles, at stage 2 the output
items asked about the students' motivation to learn and their
perceptions of the module's relevance to their perceived future
profession. A similarly positive impression was found in these
variables (3.9–4.2), and the module was regarded as highly
relevant for their future profession as well as for the following
years of the curriculum.

Mean scores of all positive items of student satisfaction
were quite stable at around 4, and item scores converged over
the years. Although we decreased the practical hours, this
stable high satisfaction was observed in the last years.

One-way ANOVA shows significant differences in students'
satisfaction on nine items. Table 6 indicates the causes of the
differences according to the Bonferroni post hoc analysis;
these changes are elaborated further below. Because there
were no significant differences for items 3, 11 and 12 by year,
we did not include them in the following analysis.

. Item 1. Physical setting. As seen in the table, there is no
difference within stage 1 or within stage 2.
. Item 2. Duration of the module. There is a moderate
significant difference caused by the 3rd year.
. Item 4. Module notes and study guide. In the first year of
each stage, the module notes and study guides received
relatively lower scores from students.
. Items 5 (tutors' knowledge and skills to teach BLS),
6 (tutors' coaching) and 7 (active involvement in the
sessions). Item 5 did not exist in SEQa, so we compared
only the SEQb means. In items 5, 6 and 7, the 4th-year
mean scores were slightly better than those of the following
years, and this caused moderate significance.
. Item 8. Level of enjoyment while learning. The mean
difference between the 1st- and 4th-year modules is the
cause of significance; students' evaluations were relatively
lower in the 1st year than in the 4th.
. Items 9 (load of given information) and 10 (guidance
information). These items did not exist in SEQa. In the
5th and 6th years, students' perceptions were moderately
more negative than in the 4th year.
. Items 3, 11 and 12 do not show significant differences by year.

Discussion

The assessment scores were generally high and the fail rates
were low. The MCQ and OSCE scores showed a significant
correlation. According to the students' ratings, they were
highly satisfied with the module; the response rate of the
questionnaire was high and its reliability acceptable.

According to the students' free comments, we attributed the
students' high level of satisfaction to two main perspectives:
. Pedagogical principles of the module allowed us to
align the instruction constructively, and the cognitive–constructivist
approach fitted well with the students'
expectations and needs. Supporting the tutorials with
scenarios increased the reality of the training, permitting
complete practice, exploration and reflection. One
important comment is that, in order to use scenarios,
a careful task analysis is required: BLS skills should be
clustered into sequential procedures that are applicable
to individual scenarios. This makes instruction context-bound
and outcome-focused, and thus coherent with
adult learning principles (Peyton, 1998).
. The timing of, and the time devoted to, BLS skills
training in the curriculum is another explanation of the
high level of student satisfaction. BLS skills training
in the first year fostered students’ motivation toward
learning. Students saw themselves as doctors who save
lives. Actually, they clearly wrote these last words in
the questionnaires. Our results support the usefulness
of introducing clinical skills in the early years of
curriculum (Lam et al., 2002).
In terms of significant differences in student ratings
between the years and/or stages, for physical settings
(item 1), we attribute this difference to the change in skills
Table 5. Educational years and number of responses and coverage rates and average means and standard deviations (in brackets) of the students' ratings.

1. Physical settings were adequate.
2. Total duration of the module was sufficient.
3. Various appropriate educational devices were used.
4. Module notes and study guide helped me to learn.
5. Skills trainers had a sufficient level of knowledge and skills to teach BLS.
6. Skills trainers were good COACHES to teach BLS.
7. There was enough opportunity provided to be actively involved in the sessions.
8. I enjoyed this module while I was learning.
9. In the sessions there was an overload of information.
10. During the module, ''where I must be when'' information was always sufficiently provided.
11. I believe I'll use these skills in my further educational years.
12. I believe I'll use these skills in my professional life.

Numbers of respondents by year: N = 103, 345 and 208 (SEQa years) and N = 265, 210 and 366 (SEQb years); total N = 1497. Item means (SD) ranged from 2.4 (1.0) to 4.5 (0.8).
* The mean difference is significant at the 0.05 level.
_ Durak et al.
H. I.
Basic life support skills training in a first year medical curriculum
Table 6. Bonferroni post hoc analysis results for significant differences.
(4), (7)
(4), (7)
(1), (7), (8)
(1), (4), (7)
(1), (4), (7)
(1), (2)
(6), (9), (10)
(5), (6), (9), (10)
The mean difference is significant at the 0.05 level. (The numbers in plain brackets indicate the SEQa–b items’ numbers.
The numbers in bold brackets indicate the SEQb items’ numbers.)
lab conditions. We do not have any objective findings to
explain the moderate difference in judgment of the duration
of the module (item 2) caused by 3rd year. We attribute the
moderate significance in 5, 6, and 7 seen by 4th year
(first year of stage 2) to high motivation of the tutors. In terms
of level of enjoyment while learning (item 8), we do not have
any attributable findings to explain this moderate difference
between the 1st and the 4th year. Perhaps being a novice in
small group teaching caused low satisfaction in the 1st year.
Item 9 (asks about information overload) and item 10
(asks about satisfaction on guidance information) reveal
that without changing the amount or type of information in
the last three years, it is possible to interpret that we found the
best satisfaction scores in the first year of the cognitive
apprenticeship adopted module (4th year). This general
tendency to rate higher in the first year of stage 2 might
explain the differences in these items.
According to the WHO's European survey in 1997, BLS training in medical schools varies in content, format (theory and/or practice) and time spent (13.3 ± 9.7 hours) (Garcia-Barbero & Caturla-Such, 1999), and there is still a long way to go towards standardized training (Perkins et al., 1999; Jordan & Bradley, 2000). The European survey points out that more time is devoted to theoretical teaching than to practice. Turkish medical schools provide 6.9 ± 6.6 hours of theoretical and 6.9 ± 5.7 hours of practical BLS training. In our designs, we devoted a similar amount of time to theory (9 hours) but two to three times more to practice (12–20 hours).
As one limitation of this study, although we upgraded the design in stage 2, our findings do not provide enough specific information to compare the stages. We do not know whether upgrading the educational approach to a cognitive apprenticeship model affected the students' performance and evaluations. As we moved to smaller-group instruction over the years, we placed more emphasis on the students' own reflections. Although the number of practical hours decreased, our results show that the quality was stable, or even better. In the second stage, students are likely to learn more meaningfully.
The module's design and quality-improvement perspectives triggered and fostered several local innovations. These include the development and implementation of a structured skills training format, the introduction of a skills laboratory, interdisciplinary cooperation, skills tutors and their training, and a skills test in the Objective Structured Clinical Examination (OSCE).
As a 'positive side effect' of this experience, besides the module we trained nearly 2000 government employees working as primary health care workers in 2001, in 28 one-day courses, using a compact program with the same pedagogical approach.
The results of this study suggest that, in the first year of the medical curriculum, within the structured, outcome-focused cognitive apprenticeship model adopted at the skills laboratory, with 9 hours of theoretical training (lecture, BLS algorithm discussion) and 12–20 hours of practical training in tutorial groups of 12–17 students, medical students can develop sufficient knowledge and skills to perform BLS competently on manikins and their peers by the end of the training. This cognitive–constructivist design for skills training is sufficient for performing BLS competently in an OSCE and is associated with stable, high student satisfaction. However, as a final limitation of the study, the lack of controls limits the extrapolation of this educational design to the acquisition of BLS skills by first-year medical students.
Acknowledgements

We are thankful to our colleagues, Dr. S. Elif Törün, Dr. Kevser Vatansever, Dr. Sürel Karabilgin, and Dr. A. Hilal Batı for their tutorship, and to all our students involved in the BLS skills training in 1997–2003.
Notes on contributors

AGAH ÇERTUĞ, Ege University Faculty of Medicine, Department of Anesthesiology, Izmir, Turkey, is the main instructor of the BLS skills training program.
The other authors, of Ege University Faculty of Medicine, Department of Medical Education, Izmir, Turkey, are the module's skills tutors, and HİD provided the original idea for this study. SAÇ contributed to data collection and analysis, and HİD wrote the manuscript.
JAN VAN DALEN, University of Maastricht, Skillslab, Maastricht, The Netherlands, provided support, guidance and amendments to the manuscript.

References
DU BOULAY, C. & MEDWAY, C. (1999) The clinical skills resource: a review of current practice, Medical Education, 33, pp. 185–191.
BROWN, J.S., COLLINS, A. & DUGUID, P. (1989) Situated cognition and the culture of learning, Educational Researcher, 18, pp. 32–42.
BRUCE, N.C. (1989) Evaluation of procedural skills of internal medicine
residents, Academic Medicine, 64, pp. 213–216.
BRYMAN, A. & CRAMER, D. (2002) Concepts and their measurement,
in: A. Bryman & D. Cramer (Eds) Quantitative Data Analysis with SPSS
Release 10 for Windows, pp. 54–68 (New York, Routledge).
DENT, J.A. (2001) Current trends and future implications in the developing role of clinical skills centres, Medical Teacher, 23, pp. 483–489.
FORD, K.J. & KRAIGER, K. (1995) The application of cognitive constructs
and principles to the instructional systems model of training: implications for needs assessment, design and transfer, International Review of
Industrial and Organizational Psychology, 10, pp. 1–47.
FOWELL, S. & BLIGH, J. (2001) Assessment of undergraduate medical education in the UK: time to ditch motherhood and apple pie, Medical Education, 35, pp. 1006–1007.
GARCIA-BARBERO, M. & CATURLA-SUCH, J. (1999) What are we doing in
cardiopulmonary resuscitation training in Europe? An analysis of a
survey, Resuscitation, 41, pp. 225–236.
GREENWALD, A.G. (1997) Validity concerns and usefulness of students’
ratings, American Psychologist, 52, pp. 1182–1186.
HAFFERTY, F.W. (1998) Beyond curriculum reform: confronting
medicine’s hidden curriculum, Academic Medicine, 73, pp. 403–407.
HAMO, I.M. (1994) The role of the Skills Laboratory in the integrated curriculum of the Faculty of Medicine and Health Sciences, UAE University, Medical Teacher, 16, pp. 167–178.
HANDLEY, A.J., MONSIEURS, K.G. & BOSSAERT, L.L. (2001) European Resuscitation Council Guidelines 2000 for Adult Basic Life Support. A statement from the Basic Life Support and Automated External Defibrillator Working Group (1) and approved by the Executive Committee of the European Resuscitation Council, Resuscitation, 48, pp. 199–205.
HARDEN, R.M. & GLEESON, F.A. (1979) Assessment of clinical
competence using an objective structured clinical examination,
Medical Education, 13, pp. 19–22.
HUNSKAAR, S. & SEIM, S.H. (1985) Medical students’ experiences in
medical emergency procedures upon qualifications, Medical Education,
19, pp. 294–298.
ISSENBERG, S.B. (2002) Clinical skills—training makes perfect, Medical
Education, 36, pp. 210–211.
ISSENBERG, S.B., MCGAGHIE, W.C., HART, I.R., et al. (1999) Simulation
technology for health care professional skills training and assessment,
Journal of the American Medical Association, 282, pp. 861–866.
JORDAN, T. & BRADLEY, P. (2000) A survey of basic life support training
in various undergraduate health care professions, Resuscitation, 47,
pp. 321–323.
KAYE, W. & MANCINI, M.E. (1998) Teaching adult resuscitation
in the United States—time for a rethink, Resuscitation, 37,
pp. 177–187.
LAM, T.P., IRWIN, M., CHOW, L.W. & CHAN, P. (2002) Early introduction of clinical skills teaching in a medical curriculum—factors affecting students' learning, Medical Education, 36, pp. 233–240.
LEUNG, W.C. (2002) Competency based medical training, British Medical
Journal, 325, pp. 693–696.
MARSH, H.W. (1984) Students’ evaluations of university teaching:
dimensionality, reliability, validity, potential biases and utility, Journal
of Educational Psychology, 76, pp. 707–754.
MCKEACHIE, W.J. (1996) Student ratings of teaching, American Council of
Learned Society Occasional Paper, 33, pp. 12–17.
MENNIN, S.P. & KALISHMAN, S. (1998) Student assessment, Academic Medicine, 73, pp. S46–S54.
PATRICK, J. (2002a) Learning and skill acquisition, in: Training Research and Practice, pp. 19–74 (North Yorkshire, Academic Press).
PATRICK, J. (2002b) Task oriented analysis, in: Training Research and
Practice, pp. 131–168 (North Yorkshire, Academic Press).
PERKINS, G.D., HULME, J., SHORE, H.R. & BION, J.F. (1999) Basic
life support training for health care students, Resuscitation, 41,
pp. 19–23.
PEYTON, J.W.R. (Ed.) (1998) Teaching & Learning in Medical Practice (Guilford, Manticore Europe Ltd.).
PHILLIPS, B., ZIDEMAN, D., WYLLIE, J., et al. (2001) European Resuscitation Council Guidelines 2000 for Basic Paediatric Life Support, Resuscitation, 48, pp. 223–229.
PHILLIPS, P.S. & NOLAN, J.P. (2001) Training in basic and advanced life
support in UK medical schools: questionnaire survey, British Medical
Journal, 323, pp. 22–23.
ROLFE, I.E. & SANSON-FISHER, R.W. (2002) Translating learning
principles into practice: a new strategy for learning clinical skills,
Medical Education, 36, pp. 345–352.
STILLMAN, P.L., WANG, Y., OUYANG, O., et al. (1997) Teaching and
assessing clinical skills: a competency-based programme in China,
Medical Education, 31, pp. 33–40.
WHELAN, G.P. (2000) Educational Commission for Foreign Medical Graduates: lessons learned in a high-stakes, high-volume medical performance examination, Medical Teacher, 22, pp. 293–296.
WILKINSON, T., NEWBLE, D.I. & FRAMPTON, C.M. (2001) Standard setting
in an objective structured clinical examination. Use of global ratings of
borderline performance to determine the passing score, Medical
Education, 35, pp. 1043–1049.
WILSON, D.B. & JENNETT, P.A. (1997) The Medical Skills Centre at the University of Calgary Medical School, Medical Education, 31, pp. 45–48.