Submitted to Journal of College Science Teaching, May 2015
A Visual Approach to Helping Instructors Integrate, Document, and Refine Active Learning
Fisher, B. A., Solomon, E. D., Leonard, D. A., Mutambuki, J. M., Cohen, C. A., Luo, J., Pondugula, S., & Frey, R. F.
Abstract
If instructors are to integrate active learning effectively in courses in Science, Technology, Engineering,
and Mathematics (STEM), they need an accurate account of when and how they are integrating active
learning--and of how students are responding. Without such an account, instructors may perceive that
they are incorporating more active learning than observers document, or they may miss opportunities
to target aspects of the implementation that may be adjusted to improve effectiveness. This article
describes a visual approach to integrating observational data into self-evaluation and peer review of
teaching, practices that can lead to adoption of evidence-based active-learning strategies in STEM.
While our approach has specific relevance during this period of reform in STEM education, it was
designed to be implemented for undergraduate courses across the disciplines. The presentation of
observational data in a timeline provides a “big-picture” view of observed class sessions that captures
the sequencing of instructional strategies and the “ebbs and flows” of student participation—in a
chronological format that coheres with how instructors often visualize a class session. Such a view can
help instructors see where these strategies meet their instructional goals, and where these strategies
might be refined and improved.
A Visual Approach to Helping Instructors Integrate, Document, and Refine Active Learning
As instructors and institutions respond to calls for broad-based integration of active learning
into undergraduate STEM curricula (e.g., American Association for the Advancement of Science, 2011;
Olson & Riordan, 2012), there is a clear need not only to train faculty in effective implementation of
these methods (Andrews, Leonard, Colgrove, & Kalinowski, 2011), but also to document what occurs in
the classroom when these methods are implemented (Hora & Ferrare, 2014; Lund et al., 2015).
Instructors need an accurate account of when and how they are integrating active-learning strategies, and of how students are responding. Without such an account, they may perceive that they are
incorporating more active learning than observers document (Ebert-May et al., 2011), or they may miss
opportunities to target aspects of the implementation that may be adjusted to improve effectiveness.
Recently, researchers have developed and applied a variety of tools that may be used to
document instruction in STEM. These tools include the Reformed Teaching Observation Protocol (RTOP)
(Ebert-May et al., 2011; Sawada et al., 2002), the Teaching Dimensions Observation Protocol (TDOP)
(Hora, 2013), and the Classroom Observation Protocol for Undergraduate STEM (COPUS) (Smith, Jones,
Gilbert, & Wieman, 2013). Among other uses, observational data can be employed in practices that can
accelerate improvements in teaching, including self-evaluation and peer review of teaching—whether by colleagues in the department or by faculty developers (Gormally, Evans, & Brickman, 2014; Hora, 2013; Smith et al., 2013; Smith, Vinson, Smith, Lewin, & Stetzer, 2014). This article describes a visual
approach to integrating observational data into these practices.
Our approach displays observational data in a timeline showing instructor and student behaviors
as they co-occur, along with observed levels of student note-taking and attention. This approach
employs data gathered with a protocol that we developed—the Observation Protocol for Active
Learning (OPAL). However, it can be adapted for use with other protocols, such as TDOP or COPUS, that
document instructor and student behaviors within short intervals (e.g. Hora, 2013; Smith et al., 2013).
While our approach has specific relevance during this period of STEM-education reform, it was designed
to be used to document various modes of teaching and learning—including active learning—in
undergraduate courses of different types, sizes, and levels across the disciplines.
The timeline we developed is designed to be intuitive for instructors to review and understand,
so that they may immediately use the data to reflect on, refine, and improve their teaching. The timeline
creates a “big-picture” representation of how individual instructors integrate multiple instructional
strategies either simultaneously (e.g. lecturing with questions, demonstration with discussion) or in
succession (e.g. group work followed by whole-class discussion, followed by lecture). As Hora (2013)
notes in regard to an alternative timeline approach employed with TDOP data, the chronological display
of observational data can make visible the “sequencing and dosage” of specific teaching strategies
across a class session (p. 24). Our timeline also makes visible how students respond and contribute to
these strategies, including the numbers of questions asked and answered, as well as the level of student
note-taking and attention observed.
The chronological display of observational data differs from approaches that display the relative
proportions of different instructional strategies in pie charts or graphs (e.g. Smith et al., 2013; Smith et al., 2014). The latter approach is useful when measuring the prevalence of instructional strategies across class sessions and semesters, and when assessing quantitative changes in instruction as they occur over time (Hora & Ferrare, 2014; Smith et al., 2013). However, an approach that uses counts of
instructional strategies that have been implemented at different times during one or more class sessions
does not provide instructors with a visual representation of what occurred within the chronological
framework of each observed session. In contrast, a timeline approach can help instructors visualize
observed class sessions as they often plan for and experience their classes: as sequentially organized
blocks or segments of instruction and interaction.
Discussion of the OPAL timeline with a colleague during peer review of teaching, or review of
the timeline by the instructor for the purpose of self-evaluation, facilitates critical thinking about such
issues as i) whether the observed occurrence and timing of specific instructional strategies coheres with
the instructor’s goals and intentions for the class session; ii) whether the timeline reflects the optimal
timing and sequencing of different strategies, according to the instructor’s objectives; and iii) whether
there appears to be a correlation between specific pedagogical approaches and student participation,
attention, and note-taking. For example, reviewing the timeline can provide instructors with a specific
idea of how they are spacing active learning within a class session, a factor that has been shown to
influence students’ attention (Bunce, Flens, & Neiles, 2010). The OPAL timeline therefore provides a
form of documentary feedback that can be instrumental in helping instructors reflect on, and make
changes in, their teaching.
Observation Protocol for Active Learning (OPAL)
This study employs data collected through application of the Observation Protocol for Active Learning (OPAL). Similar to protocols such as COPUS, OPAL was designed to provide descriptive, rather
than evaluative, feedback. The activities documented by OPAL were either adapted from TDOP and
COPUS or developed by our research team, with the goal of designing a protocol that would be broadly
applicable across disciplines and different types of courses. OPAL observers do not need to be experts in
pedagogy or in the disciplinary content taught in the class being observed, but they do need to be
trained in how to apply the protocol. In the case of our study, the observers have a range of expertise in
teaching and in STEM disciplines; they include instructors, faculty-development staff, graduate students,
and postdoctoral fellows. OPAL training requires 5-8 hours. (For additional details, see Supplemental
Materials).
Table 1 displays the categories of activities documented with OPAL, along with examples of the codes that fall under each category. The first category, Behavior Codes, covers both instructor and student behaviors. When any of these behaviors occurs during a two-minute interval, the observer marks the respective code with a checkmark or tally (i.e., occurrence sampling). Multiple codes may be marked in each interval. OPAL
includes codes for different types of lecturing, including Lec (Lecturing), Lpv (Lecturing with Pre-Made
Visuals), and Lint (Interactive Lecturing). This approach facilitates documentation of a variety of lecturing
styles (Hora & Ferrare, 2014).
All of the OPAL codes are recorded as nominal-level data; in other words, they are recorded as occurring, or not, during each interval. In this study, however, a few codes are also recorded as ratio-level data, with the observers marking each time the behavior occurs during each two-minute interval.
Examples include PQv (Pose Question Verbally) and AnQ (Answer Question). Thus, if the instructor asks
two questions and elicits three answers, then the instructor code PQv is marked twice and the student code AnQ is
marked three times during that interval. Documenting the numbers of questions and answers allows an
instructor reviewing OPAL data to consider whether the documented number coheres with the
instructor’s plans for a given segment of the class.
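To make this recording scheme concrete, the following minimal Python sketch shows one way to represent a single interval's marks; the data structure, class name, and code list are our own illustration, not part of OPAL itself.

```python
from dataclasses import dataclass, field

# Codes recorded in this study as ratio-level tallies; all other codes are
# nominal (marked once if the behavior occurred at all in the interval).
# The starred codes follow Table 1; the set itself is our assumption.
RATIO_CODES = {"PQv", "AnQ", "VT"}

@dataclass
class Interval:
    """Marks recorded during one two-minute observation interval."""
    start_minute: int
    codes: dict = field(default_factory=dict)  # code -> tally (1 = occurred)

    def mark(self, code):
        """Record one occurrence of a behavior code."""
        if code in RATIO_CODES:
            # Ratio-level: tally every occurrence within the interval.
            self.codes[code] = self.codes.get(code, 0) + 1
        else:
            # Nominal-level: record only that the behavior occurred.
            self.codes[code] = 1

# The example from the text: the instructor asks two questions (PQv) and
# students give three answers (AnQ) while the instructor lectures with
# pre-made visuals (Lpv).
interval = Interval(start_minute=8)
for code in ["Lpv", "PQv", "PQv", "AnQ", "AnQ", "AnQ"]:
    interval.mark(code)
print(interval.codes)  # {'Lpv': 1, 'PQv': 2, 'AnQ': 3}
```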
The second category, Activity Levels, includes student attention and note-taking. This approach is similar to that employed for “engagement level” in the COPUS procedures, in which levels are marked as low (less than 20% of students), medium (20-80% of students), or high (more than 80% of students) (Smith et al., 2013). However, in the case of OPAL, these activities are specifically
defined as attention and note-taking (activities that can be conflated in the COPUS procedures). The
note-taking code includes a response option of zero for instances when no students appear to be taking
notes. (Note-taking may be performed by writing in a notebook or typing on a laptop or other mobile
device.) The recording methodology applied for Activity Levels differs from that applied for Behavior Codes, in that the observer uses “scan sampling”: at the end of every two-minute interval, the observer scans the whole room to assess levels of attention and note-taking.
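The cut-offs below simply restate the levels just described; the function wrapper itself, including its name and fraction-based input, is our own sketch rather than part of the protocol.

```python
def activity_level(fraction, note_taking=False):
    """Bin a scanned fraction of students into an OPAL activity level.

    Cut-offs follow the text: low (< 20%), medium (20-80%), high (> 80%);
    note-taking adds a zero option when no students appear to be taking notes.
    """
    if note_taking and fraction == 0:
        return "zero"
    if fraction < 0.20:
        return "low"
    if fraction <= 0.80:
        return "medium"
    return "high"

print(activity_level(0.0, note_taking=True))  # zero
print(activity_level(0.55))                   # medium
print(activity_level(0.90))                   # high
```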
Table 1. Sample activities documented with OPAL.

Behavior Codes (occurrence sampling)

Observed   | Behavior Category | Codes
Student    | Listening         | LI, LS
Student    | Responding        | AnQ,* VT*
Student    | Activities        | QG, WG
Student    | Assessment        | TQ, SP
Instructor | Administration    | Adc, Adt
Instructor | Lecture           | Lec, Lpv, Lint
Instructor | Activities        | PQv,* PSb, ADV
Instructor | Follow-Up         | Sfu, Dfu

Activity Levels (scan sampling)

Activity    | Levels documented
Note-taking | Zero, low, medium, high
Attention   | Low, medium, high
Note 1: All behavior codes are recorded as nominal-level data (i.e., marked as either occurring or not in each two-minute interval). In addition, in this study, codes marked with an * are also recorded as ratio-level data (i.e., the number of occurrences per two-minute interval is recorded).
Note 2: A full list of OPAL codes can be found in the supplemental materials. Below are listed the examples found in Table 1. Student codes: LI = Listening to Instructor, LS = Listening to Student, AnQ = Answering Question, VT = Voting with Technology, QG = Discussing Question in Groups, WG = Working in Groups on Worksheet Activity, TQ = Test/Quiz, SP = Student Presentation. Instructor codes: Adc = General Course Administration, Adt = Administration of Test or Quiz, Lec = Lecturing, Lpv = Lecturing with Pre-Made Visuals, Lint = Interactive Lecturing, PQv = Posing Question Verbally, PSb = Problem Solving on the Board, ADV = Active Demonstration or Video, Sfu = Summary Follow-Up, Dfu = Discussion Follow-Up. Codes were adapted from TDOP (Hora, 2013) and COPUS (Smith et al., 2014), or developed by the research team.
Thus far, we have used OPAL to observe 144 class sessions. This number includes observation of
29 instructors teaching 20 unique courses in 12 departments. We have assessed the reliability and robustness of the tool by calculating inter-rater reliability with Krippendorff’s alpha, using the online statistical calculator ReCal OIR (Freelon, 2013). For 7% of the classes, two observers documented the session; average inter-rater reliability for the behavior codes was 0.82.
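For readers who want to reproduce this kind of reliability check locally rather than through ReCal OIR, the sketch below implements Krippendorff's alpha for nominal data. It is our own illustration, not the authors' analysis code, and the observer marks shown are invented.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    `units` is a list of tuples/lists holding the values the observers
    assigned to one unit (e.g., whether one behavior code was marked in
    one two-minute interval). Units with fewer than two values are not
    pairable and are skipped.
    """
    # Build the coincidence matrix from ordered pairs of values per unit.
    coincidence = Counter()
    for values in units:
        m = len(values)
        if m < 2:
            continue
        for a, b in permutations(values, 2):
            coincidence[(a, b)] += 1 / (m - 1)
    # Marginal totals per value and grand total of pairable values.
    n_c = Counter()
    for (a, _b), count in coincidence.items():
        n_c[a] += count
    n = sum(n_c.values())
    # For nominal data, disagreement simply means "values differ".
    d_observed = sum(c for (a, b), c in coincidence.items() if a != b) / n
    d_expected = sum(n_c[a] * n_c[b]
                     for a in n_c for b in n_c if a != b) / (n * (n - 1))
    return 1 - d_observed / d_expected

# Invented example: two observers marking one code (1 = occurred) across
# ten two-minute intervals of the same class session.
observer_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
observer_2 = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(krippendorff_alpha_nominal(list(zip(observer_1, observer_2))), 2))
# -> 0.79
```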
Creation of Visual Timeline
After each class is observed, the research team creates a detailed visual timeline displaying the
data collected, using Microsoft Excel (for a sample, please see Supplemental Materials). The detailed
timeline shows all of the marked instructor and student behaviors, as they were observed occurring
during the class session, as well as the observed levels of student attention and note-taking. This version
of the timeline contains a large amount of data—more than instructors can assimilate and review during
a discussion of the observed class session (typically one hour in duration). Therefore, we created a
streamlined version of the timeline for use during peer review and self-evaluation by instructors (Figure
1). In each case, we preserve the detailed timeline for research purposes.
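The authors build both versions of the timeline in Microsoft Excel. For readers who would rather script this step, the sketch below draws a comparable band-per-code view with matplotlib; the interval data are invented, and this is an alternative rendering, not the authors' workflow.

```python
import matplotlib.pyplot as plt

# Invented example: for each detailed code, the (start minute, duration)
# spans in which it was marked during a 30-minute observation.
observed = {
    "Lpv": [(0, 6), (20, 10)],  # lecturing with pre-made visuals
    "PQv": [(4, 2), (22, 4)],   # posing questions verbally
    "WG":  [(8, 10)],           # group work on a worksheet activity
}

fig, ax = plt.subplots(figsize=(8, 2.5))
for row, (code, spans) in enumerate(observed.items()):
    ax.broken_barh(spans, (row - 0.4, 0.8))  # one horizontal band per code
ax.set_yticks(range(len(observed)))
ax.set_yticklabels(list(observed))
ax.set_xlabel("Minutes into class session")
ax.set_title("OPAL-style timeline (illustrative data)")
plt.tight_layout()
plt.show()
```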
Figure 1. Sample streamlined OPAL timeline. [Figure not reproducible in this text version. The timeline plots each two-minute interval (minutes 0-2 through 28-30) in three bands: Instructor Behaviors, Student Behaviors, and Activity Levels (note-taking and attention, shaded as > 80%, 20-80%, or < 20% of students, plus zero for note-taking only), with a separate row for the number of Q’s and A’s. Bracketed labels mark pedagogical segments such as “Admin,” “Lecture with questions,” “Problem solving with groupwork,” “Demonstration,” and “Interactive lecture.” Legend, instructor codes: A = Answer, Ad = Administration, D = Demonstration/Video, F = Follow-up, FQ = Follow-up with questions, IL = Interactive lecturing, L = Lecture with no visuals, LH = Lecture with hand-made visuals, LP = Lecture with pre-made visuals, M = Moving around room, PS = Posing problem-solving activity, Q = Question. Student codes: A = Answer, G = Groupwork, LA = Listening to administration, Li = Listening, Q = Question, V = Voting.]
The streamlined timeline differs from the detailed version in two ways. First, similar codes are
consolidated. For example, OPAL codes for questions, such as PQv (Pose Question Verbally) and ChQ
(Brief Question to Check Comprehension) are consolidated and marked on the streamlined timeline as Q
(Question). Second, we add brackets and labels to indicate recognizable segments of pedagogical
strategies, e.g. “lecture with questions,” “lecture,” “interactive lecture,” or “problem-solving with group
work.” This step is performed by a senior member of the research team, who examines the detailed
timeline and identifies segments of related instructor behaviors. In cases where it is not immediately
clear how to label a segment, multiple members of the research team discuss possibilities and
determine the appropriate label. In both the detailed and the streamlined versions, the ratio-level data
(number of questions and answers) are shown on a separate timeline (denoted as Number of Q’s and
A’s). In this timeline, each question or answer is represented by one cell, and questions and answers by
students and instructors are merged into a single number for each two-minute interval. For a complete
list of streamlined codes and examples of bracket labels, please see the supplemental materials.
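As a minimal sketch of these two steps, the snippet below consolidates detailed codes into streamlined ones and merges question and answer tallies per interval. Only the PQv/ChQ-to-Q mapping comes from the text; the other entries are inferred from the Figure 1 legend, and the function names are ours.

```python
# Partial consolidation map (illustrative; see the supplemental materials
# for the full mapping). PQv/ChQ -> Q is described in the text; the
# remaining entries are inferred from the Figure 1 legend.
STREAMLINED = {
    "PQv": "Q",    # Pose Question Verbally
    "ChQ": "Q",    # Brief Question to Check Comprehension
    "AnQ": "A",    # Answering Question
    "Lpv": "LP",   # Lecturing with Pre-Made Visuals
    "Lint": "IL",  # Interactive Lecturing
}

def streamline(detailed_codes):
    """Collapse one interval's detailed codes into streamlined codes,
    leaving codes without a mapping unchanged."""
    return {STREAMLINED.get(code, code) for code in detailed_codes}

def merge_q_and_a(instructor_tallies, student_tallies):
    """Merge instructor and student question/answer tallies into the
    single per-interval count shown on the "Number of Q's and A's" row."""
    qa_codes = ("PQv", "ChQ", "AnQ")
    return (sum(instructor_tallies.get(c, 0) for c in qa_codes)
            + sum(student_tallies.get(c, 0) for c in qa_codes))

print(streamline({"Lpv", "PQv", "ChQ"}))      # {'LP', 'Q'} (order may vary)
print(merge_q_and_a({"PQv": 2}, {"AnQ": 3}))  # 5
```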
Instructor Feedback
Preliminary feedback, gathered via a qualitative survey on instructors’ perceptions of the
usefulness of the OPAL timeline, suggests that instructors value the timeline as a tool that helps them
visualize, think critically about, and modify the sequence and timing of methods used during a class
session. A lecturer in physics remarked, for example, that without the OPAL timeline, “I can note the
major aspects of a class (e.g., lecture, example problem, small group work, etc.), but it's almost
impossible to remember all the details of the interactions within each activity. [Reviewing] the OPAL
data [provided] a convenient way to see the breakdown of each in-class activity, [to] note how and
when my students responded to what I was doing, and to note which activities kept them most active
and engaged. It also helped me easily target segments of low engagement and think about how I could
increase engagement and student activity during those times” (Fisher & Frey, 2015, p. 6). When multiple
class sessions were observed, this instructor added, reviewing multiple OPAL timelines “could either indicate trends and/or indicate if one class was an anomaly (and then lead you to look at why).” An
engineering instructor remarked, “I had a qualitative feel for how I blocked out my class session time.
These data helped to quantify that apportionment and also [to] face the reality that there was not as
much two-way interaction as I had perceived.” An instructor in chemistry remarked, moreover, that moving from a detailed to a streamlined version of the timeline was especially helpful: this new timeline made the data “much easier to digest [in terms of] both specifics that are occurring during the two-minute intervals and how the lecture is broken down into larger chunks.”
One instructor was surprised when the timeline revealed that students were not taking notes
during the segment in which the instructor facilitated discussion of responses to a “clicker” question.
The instructor and the reviewer were then able to formulate new strategies for signaling to students
that the content of the discussion was essential to their learning in the course and therefore should be
included in students’ notes. In subsequent class sessions, the instructor used the chalkboard to record
the problem-solving approach generated during the discussion. An OPAL timeline produced through
observation of one of these later classes showed that the students were indeed taking notes when the
instructor used the chalkboard in this way. This example underscores the importance of separating
note-taking from attention in classroom observations of student behavior.
Conclusion
Preliminary feedback on the OPAL timeline suggests that it provides a means for faculty to “see”
their teaching not only with greater clarity and accuracy, but also with a wider angle of vision. The OPAL
timeline provides a “big-picture” view of observed class sessions that captures the sequencing of
different instructional strategies and the “ebbs and flows” of student participation—in a chronological
format that coheres with how instructors often visualize a class session. Such a view can help instructors
see where these strategies meet their instructional goals, and where these strategies might be refined
and improved. Reviewing OPAL data may help instructors develop “a clearer vision of their own teaching
before and after they make a change, . . . [as well as] observational and critical skills that they can then
apply to reflect on their own teaching or to observe a colleague” (Fisher & Frey, 2015, p. 6). We have
begun to integrate review of OPAL timelines into the practice of peer review of teaching in a mentoring program for junior faculty in STEM. By evaluating the usefulness of this approach for faculty mentors as well as mentees, we will better understand how the OPAL timeline can help instructors make sustainable changes in their teaching.
Acknowledgements
The authors wish to thank the Association of American Universities and the Professional and
Organizational Development (POD) Network in Higher Education for their support of this project. We
also would like to thank the faculty members who have participated in the project and contributed to its
development.
References
American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC: Author.
Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated
with student learning in a random sample of college biology courses. CBE-Life Sciences
Education, 10, 394-405.
Bunce, D. M., Flens, E. A., & Neiles, K. Y. (2010). How long can students pay attention in class? A study of student attention decline using clickers. Journal of Chemical Education, 87(12), 1438-1443.
Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61(7), 550-558.
Fisher, B. A., & Frey, R. F. (2015). Using documentary tools to foster the practice of scholarly teaching.
National Teaching and Learning Forum, 24(2), 4-6.
Freelon, D. (2013). ReCal OIR: Ordinal, interval, and ratio intercoder reliability as a web service.
International Journal of Internet Science, 8(1), 10-16.
Gormally, C., Evans, M., & Brickman, P. (2014). Feedback about teaching in higher ed: Neglected opportunities to promote change. CBE-Life Sciences Education, 13(2), 187-199.
Hora, M. T. (2013). Exploring the use of the Teaching Dimensions Observation Protocol to develop fine-grained measures of interactive teaching in undergraduate science classrooms. University of Wisconsin–Madison, Wisconsin Center for Education Research Working Paper 2013-6. Retrieved from http://www.wcer.wisc.edu/publications/workingPapers/papers.php
Hora, M. T., & Ferrare, J. J. (2014). Remeasuring postsecondary teaching: How singular categories of
instruction obscure the multiple dimensions of classroom practice. Journal of College Science
Teaching, 43(3), 36-41.
Lund, T. J., Pilarz, M., Velasco, J. B., Chakraverty, D., Rosploch, K., Undersander, M., & Stains, M. (in
press). The best of both worlds: Building on the COPUS and RTOP observation protocols to easily
and reliably measure various levels of reformed instructional practice. CBE-Life Sciences
Education, 14(2).
Olson, S., & Riordan, D. G. (2012). Engage to excel: Producing one million additional college graduates
with degrees in science, technology, engineering, and mathematics. Report to the
President. Executive Office of the President.
Sawada, D., Piburn, M. D., Judson, E., Turley, J., Falconer, K., Benford, R., & Bloom, I. (2002). Measuring
reform practices in science and mathematics classrooms: The reformed teaching observation
protocol. School Science and Mathematics, 102(6), 245-253.
Smith, M. K., Jones, F. H., Gilbert, S. L., & Wieman, C. E. (2013). The Classroom Observation Protocol for
Undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom
practices. CBE-Life Sciences Education, 12(4), 618-627.
Smith, M. K., Vinson, E. L., Smith, J. A., Lewin, J. D., & Stetzer, M. R. (2014). A campus-wide study of STEM courses: New perspectives on teaching practices and perceptions. CBE-Life Sciences Education, 13(4), 624-635.