BSCS Presentation

A Process for Analyzing Coherence
in Science Instructional Materials
Presenter: April Gardner
Collaborators: Joseph A. Taylor, BSCS; Rodger W. Bybee,
BSCS; Liu Enshan, Beijing Normal University
2 May 2010
AERA Annual Meeting
Denver, CO
Project Background
• Asia Society convened meetings of American and Chinese math and science education leaders in the mid-2000s
• This group encouraged collaborative work to examine and compare
curriculum standards and materials in the two countries
• In 2006, BSCS received a grant from the Office of Science
Education at NIH to collaborate with Beijing Normal University
biology educators to compare U.S. and Chinese biology textbooks
Initial project goals
Compare U.S. and Chinese secondary biology programs for:
• Foundational concepts
• Role of scientific inquiry
• Role of educational technology
Refining the goals
Goals were refined and expanded based on our recognition that:
• both the U.S. and China had developed and adopted national science curriculum standards in the recent past (1996 for the U.S. and 2003 for China),
• both countries had popular, pre-standards programs in use as well
as less-popular, post-standards programs in use, and
• it would be interesting to compare not only distinctions between
American and Chinese instructional materials, but also distinctions
between pre- and post-standards curriculum materials.
An emphasis on curricular coherence
As we continued this work, we further recognized that:
• international comparison studies identified curricular coherence as a
key distinction between high- and low-performing countries,
• focus and rigor have been described as two components of overall
coherence (Schmidt et al., 2005; Schmidt & Prawat, 2006), and
• focus and rigor are key features of science curriculum standards in
both the U.S. and China.
Thus, our emphasis changed to
• analysis of the coherence (focus and rigor) of one pre-standards and
one post-standards biology program from each country
A more interesting story
When we completed this work, we recognized that we had
developed and tested a method for analyzing the coherence of
instructional materials, and
that’s the more interesting story I’ll relate today . . .
Four programs reviewed
Program | Country | Year National Standards Released | Type of Program | % of Schools Using the Program | Year of First Edition | Year of Edition Analyzed
A | China | 2003 | Pre-standards | Approximately 90% | 1997 | 2002/3
B | USA | 1996 | Pre-standards | Approximately 33% | 1991 | 2006
C | China | 2003 | Post-standards | Less than 5% | 2005 | 2005
D | USA | 1996 | Post-standards | Less than 5% | 1997 | 2006
Process for evaluating coherence:
Background
• Adaptation of part of the Analyzing Instructional Materials (AIM)
process (Powell et al., 2002)
• Uses criteria based on research about how students learn science
• These criteria address both focus and rigor
Process for evaluating coherence:
Analyzing focus
Example:
1. Construct and score a conceptual flow graphic (CFG)
• Write the overarching concept of the chapter and the major concept(s) of each section
• Depict the strength of connections among these using arrows of varying thicknesses
• Score each connecting link based on its strength: 2 for strong, 1 for moderate, and 0 for weak or no link
• CFG percent score = sum of all link scores divided by the sum if all links were strong
• Transform the percent score to a score of 5, 3, or 1 (see the sketch following this slide):
Percent | Focus Level | Score
<33% | weakly focused | 1
33–66% | moderately focused | 3
>66% | strongly focused | 5
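The link-scoring arithmetic on this slide can be expressed directly in code. The following is a minimal Python sketch, not part of the original AIM materials; the function and variable names are illustrative.

```python
# Minimal sketch of the CFG scoring arithmetic described above.
# Link strengths: 2 = strong, 1 = moderate, 0 = weak or no link.

def cfg_percent_score(link_strengths):
    """Percent score = sum of link scores / maximum possible sum (all links strong)."""
    if not link_strengths:
        raise ValueError("A CFG needs at least one link to score.")
    max_possible = 2 * len(link_strengths)  # every link scored as strong
    return 100.0 * sum(link_strengths) / max_possible

def focus_score_from_percent(percent):
    """Transform the percent score into the 5/3/1 focus score."""
    if percent > 66:
        return 5  # strongly focused
    if percent >= 33:
        return 3  # moderately focused
    return 1      # weakly focused

# Hypothetical link strengths reaching 50% of possible strong links,
# as in the Program B sample CFG later in the deck.
links = [2, 2, 1, 1, 0, 0]
pct = cfg_percent_score(links)              # 50.0
print(pct, focus_score_from_percent(pct))   # 50.0 3 -> moderately focused
```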
Process for evaluating coherence:
Analyzing focus, continued
2. Search instructional materials for evidence of focus based on
sequencing, context, and key concepts.
3. Evaluate evidence for strengths & weaknesses, and score as 5, 3, or 1 (strongly, moderately, or weakly addresses the criterion, respectively).
Criteria (abbreviated descriptions):
Sequencing
- organized to promote student understanding
- links facts and concepts explicitly
- builds & extends previously developed concepts
- connects concepts strongly to overarching conceptual framework
Context
- presented in engaging context related to real-world experiences
- facilitates assimilation/reorganization of knowledge that builds on students’ prior conceptions and experiences
Key Concepts
- structured using global, unifying themes of biology
- emphasizes connections across discipline areas
Process for evaluating coherence:
Analyzing rigor
4. Search instructional materials for evidence of rigor based on depth
of treatment, engagement of prior knowledge, student
metacognition, and abilities and understandings of scientific inquiry.
5. Evaluate evidence for strengths & weaknesses, and score as 5, 3, or 1, as for the focus criteria (a sketch of recording these criterion scores follows the list below).
Criteria (abbreviated descriptions):
Depth of treatment
- treat concepts at developmentally appropriate level
- require students to apply & demonstrate understanding in >1 way
Engaging prior knowledge
- help students make current understanding of concept explicit
- challenge/confront current thinking about concept
Metacognition
- include strategies to help students assess their own learning
- include strategies to help students reflect on what & how they learned
Abilities of scientific inquiry
- require students to design & conduct scientific investigations
- require students to formulate & revise explanations based on data
Understandings of scientific inquiry
- help students understand that scientists conduct investigations for a variety of reasons
- help students understand that scientists use a variety of tools, technology, & methods to enhance their investigations
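Steps 2–5 score each focus and rigor criterion on the same 5/3/1 scale. Below is a small Python sketch of one way the criteria and consensus scores could be recorded; the data layout and helper name are illustrative assumptions, not part of the AIM process itself.

```python
# Sketch: recording per-criterion scores on the 5/3/1 scale used in steps 2-5.
# Criterion names follow the slides; the data layout is an illustrative assumption.

FOCUS_CRITERIA = ["CFG", "Sequencing", "Context", "Key Concepts"]
RIGOR_CRITERIA = [
    "Depth of treatment",
    "Engaging prior knowledge",
    "Metacognition",
    "Abilities of scientific inquiry",
    "Understandings of scientific inquiry",
]
VALID_SCORES = {1, 3, 5}  # weakly / moderately / strongly addresses the criterion

def record_scores(criteria, scores):
    """Pair each criterion with one reviewer score, rejecting values off the 5/3/1 scale."""
    if set(scores) != set(criteria):
        raise ValueError("Provide exactly one score per criterion.")
    if not set(scores.values()) <= VALID_SCORES:
        raise ValueError("Scores must be 1, 3, or 5.")
    return {name: scores[name] for name in criteria}

# Hypothetical consensus scores for one chapter's focus criteria
focus_scores = record_scores(
    FOCUS_CRITERIA,
    {"CFG": 3, "Sequencing": 5, "Context": 3, "Key Concepts": 3},
)
```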
Process for evaluating coherence:
Scoring
6. Focus, rigor, and overall coherence scores are determined as indicated in the table below (a banding sketch follows the table):
Component | Criteria | Possible Range | Low | Moderate | High
Focus | CFG; Sequencing; Context; Key Concepts | 4–20 | 4–8 | 9–15 | 16–20
Rigor | Depth of treatment; Prior knowledge; Metacognition; Abilities of sci. inquiry; Understandings of sci. inquiry | 5–25 | 5–11 | 12–18 | 19–25
Overall Coherence | Focus & Rigor subscores | 9–45 | 9–20 | 21–32 | 33–45
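The possible ranges above are consistent with the focus and rigor subscores being simple sums of their 5/3/1 criterion scores (four focus criteria give 4–20, five rigor criteria give 5–25, and their total gives 9–45). Under that assumption, a minimal Python sketch of the banding in step 6 follows; the helper names are illustrative.

```python
# Sketch of step 6, assuming each subscore is the sum of its 5/3/1 criterion scores.
# Band cut-points follow the table above.

FOCUS_BANDS = [(4, 8, "Low"), (9, 15, "Moderate"), (16, 20, "High")]
RIGOR_BANDS = [(5, 11, "Low"), (12, 18, "Moderate"), (19, 25, "High")]
COHERENCE_BANDS = [(9, 20, "Low"), (21, 32, "Moderate"), (33, 45, "High")]

def band(score, bands):
    """Return the Low/Moderate/High label whose range contains the score."""
    for low, high, label in bands:
        if low <= score <= high:
            return label
    raise ValueError(f"Score {score} falls outside the possible range.")

def coherence_summary(focus_scores, rigor_scores):
    """Sum the criterion scores and band focus, rigor, and overall coherence."""
    focus = sum(focus_scores.values())
    rigor = sum(rigor_scores.values())
    overall = focus + rigor
    return {
        "focus": (focus, band(focus, FOCUS_BANDS)),
        "rigor": (rigor, band(rigor, RIGOR_BANDS)),
        "coherence": (overall, band(overall, COHERENCE_BANDS)),
    }

# Hypothetical criterion scores whose totals happen to match Program D's
# ecology results reported later in the deck (focus 20, rigor 23, coherence 43).
summary = coherence_summary(
    {"CFG": 5, "Sequencing": 5, "Context": 5, "Key Concepts": 5},
    {"Depth of treatment": 5, "Engaging prior knowledge": 5, "Metacognition": 3,
     "Abilities of scientific inquiry": 5, "Understandings of scientific inquiry": 5},
)
print(summary)  # focus (20, 'High'), rigor (23, 'High'), coherence (43, 'High')
```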
Research question
• Can the process we devised for assessing coherence
distinguish instructional materials developed before the
adoption of national science standards from those
developed after the adoption of the standards?
Hypothesis:
Science instructional materials developed after the adoption of
national standards will exhibit greater levels of coherence than
those developed before the adoption of national standards.
Procedure followed
1. Researchers were trained in the use of the modified AIM process
using an unrelated program.
2. Project director selected the genetics and ecology chapter(s) to be
analyzed from each of the four programs.
3. Three researchers independently reviewed the chapter(s),
identified the overarching chapter concept and section concepts,
and collected evidence to support their scores for the focus and
rigor criteria.
4. The research team met to compare and reach consensus on each
overarching chapter concept and the major section concepts.
5. The team then collaboratively constructed a CFG for each chapter.
6. The team discussed and revised scores on the focus and rigor
criteria to reach consensus on the scores.
Sample CFG: Program B
50% of possible strong links
Moderately focused (score = 3)
Sample CFG: Program D
89% of possible strong links
Highly focused (score = 5)
Example of evaluating rigor for
“understandings of scientific inquiry”
• Weak rigor (score = 1)
– The chapter described few or no scientists and their work, giving students no opportunity to see how the evidence from scientists’ investigations led them to their conclusions.
• Strong rigor (score = 5)
– The chapter described how the understanding of inheritance progressed
over time, through the ensuing work of scientists. For example, following
a detailed description of Mendel’s experiments, results, and elucidation
of the principles of inheritance, the text goes on to explain how some
results of genetic crosses conducted by other scientists did not fit
Mendel’s patterns. The text presents this puzzle as a rationale for the
subsequent work of T.H. Morgan, who related the inheritance of genes
to newer understandings about chromosomes and their movement.
Morgan’s work then led to further studies about gene linkage and
mapping.
Findings
Ecology chapters:
Program/Type | Country | Average Focus Score/Level | Average Rigor Score/Level | Overall Coherence
A / pre-standards | China | 12 / Moderate | 7 / Low | 19 / Low
B / pre-standards | USA | 14 / Moderate | 11 / Low | 25 / Moderate
C / post-standards | China | 14 / Moderate | 11 / Low | 25 / Moderate
D / post-standards | USA | 20 / High | 23 / High | 43 / High
• The process distinguished the ecology chapters of the programs developed before the adoption of national science standards from those of the U.S. program developed after the adoption of standards, based on their levels of coherence.
• The process did not distinguish the U.S. ecology chapters developed before national standards from the Chinese ecology chapters developed after the national standards were adopted.
• Programs developed after the adoption of national science standards exhibited greater coherence than those developed before national standards, for both China and the U.S.
Findings
Genetics chapters:
Program/Type | Country | Average Focus Score/Level | Average Rigor Score/Level | Overall Coherence
A / pre-standards | China | 10 / Moderate | 9 / Low | 19 / Low
B / pre-standards | USA | 4 / Low | 9 / Low | 13 / Low
C / post-standards | China | 18 / High | 17 / Moderate | 35 / High
D / post-standards | USA | 17 / High | 21 / High | 38 / High
• The process distinguished the genetics chapters of programs developed before the adoption of national science standards from those developed after, based on their levels of coherence.
• Programs developed after the adoption of national science standards exhibited greater coherence than those developed before national standards, for both China and the U.S.
Conclusions
• This adaptation of the AIM process is a promising
method for evaluating the coherence of science
instructional materials.
• This method can be used in research studies, curriculum
development, and professional development. For
example, it could be used:
– to study the relationship between the coherence of instructional
materials and specific student learning outcomes,
– by curriculum developers to strengthen the coherence of
materials they are developing, and
– in curriculum-based professional development to identify areas
in programs where coherence should be strengthened by the
teacher.
References cited on slides
Powell, J. C., Short, J. B., and Landes, N. M. (2002). Curriculum reform, professional development, and powerful learning. In R. W. Bybee (Ed.), Learning Science and the Science of Learning (pp. 121–136). Arlington, VA: NSTA Press.
Schmidt, W. H., and Prawat, R. S. (2006). Curriculum coherence and national control of education: Issue or non-issue? Journal of Curriculum Studies, 38(6), 641–658.
Schmidt, W. H., Wang, H. C., and McKnight, C. C. (2005). Curriculum coherence: An examination of US mathematics and science content standards from an international perspective. Journal of Curriculum Studies, 37(5), 525–559.
To download this presentation,
visit www.bscs.org/sessions
5415 Mark Dabling Blvd. Colorado Springs, CO 80918
info@bscs.org 719.531.5550