What Works In Teaching Science: A Meta-Analysis of Current Research
Carolyn Schroeder, Ph.D.
Center for Math & Science Education
Texas A&M University
Texas A&M University Project Staff
• Timothy P. Scott, Ph.D., Project Director
• Carolyn Schroeder, Ph.D., Senior Research Associate
• Homer Tolson, Ph.D., Senior Analyst
• Yi-Hsuan Lee, Ph.D., Analyst
• Tse-Yang Huang, Ph.D., Analyst
Advisory Board
• Carol L. Fletcher, Ph.D., Texas Regional Collaboratives, UT Austin
• Ginny Heilman, Region VI ESC
• Anna McClane, Region IV ESC
• Sandra S. West, Ph.D., Texas State University
• Jo Ann Wheeler, Region IV ESC
What teaching strategies have been shown to improve student achievement in science?
Criteria for Selection of Studies
• Dates: 01/01/1980 – 12/31/2004
• Dealt with K-12 science education in the U.S.
• Used student achievement (success, performance,
etc.) as dependent variable
• Used science education teaching strategies as
independent variables
• Was experimental or quasi-experimental
• Reported effect size (ES) or statistics necessary to
calculate it
• Could not be totally correlational
• Could not deal exclusively with special populations
• Could not be included more than once (e.g., same
study reported in a dissertation and journal article)
Acquisition of Studies
• Broad search conducted
• Over 400 potential sources identified
– Journal articles
– Conference papers
– Books
– Dissertations
– Government reports
– Unpublished papers
Search Methods
• Electronic searches
– Web of Science
– ERIC (EBSCO, First Search, CSA)
– Academic Search Premier
– PsycInfo
– ProQuest Dissertations and Theses
• Reference lists from previous meta-analyses, books & other articles, and electronic sources (e.g., government sites)
• Request to the NARST listserv
• Requests to specific developers of
instructional packages for product studies
Coding of Studies
• Study attributes coded:
– Citation
– Publication type (refereed journal, dissertation, etc.)
– Study type (experimental, quasi-experimental, correlational)
– Dependent variable (describe test used to measure achievement)
– Independent variable (describe treatment & control or alternate treatment)
– Length of treatment/study
– Setting & characteristics
• Schools (#, how selected, public/private, rural/urban, size, % free lunch)
• Students (#, how selected, how assigned, gender, grade, ethnicity, SES)
• Teachers (#, how selected, experience, gender, certification)
– Study results (ES, p, t, F, eta squared, omega squared)
Intercoder Objectivity
• 3 randomly selected articles were coded independently by the senior analyst and 2 researchers
• Degree of objectivity was 90% for two articles (see the sketch below)
• The third article was identified as correlational and therefore was not coded
• The senior analyst read & coded all articles and resolved any differences in coding values
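For readers who want to replicate the agreement check, a minimal sketch of a simple percent-agreement calculation between two coders is shown below; the attribute names and values are hypothetical, and the deck does not specify exactly how agreement was tallied.

```python
def percent_agreement(coder_a: dict, coder_b: dict) -> float:
    """Share of coded attributes on which two coders assigned the same value."""
    fields = coder_a.keys() & coder_b.keys()
    if not fields:
        return 0.0
    matches = sum(coder_a[f] == coder_b[f] for f in fields)
    return 100.0 * matches / len(fields)

# Hypothetical codings of one article by two coders
a = {"publication_type": "refereed journal", "study_type": "quasi-experimental",
     "treatment_category": "inquiry", "grade_band": "middle school"}
b = {"publication_type": "refereed journal", "study_type": "quasi-experimental",
     "treatment_category": "inquiry", "grade_band": "high school"}
print(percent_agreement(a, b))  # 75.0 (3 of 4 attributes match)
```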
Study Design Classification
• True random assignment of schools/students to
treatment and control groups
• Quasi-experimental with match of schools/students to
achievement and demographics of comparison
school/group
• Quasi-experimental with covariate adjustment for prior
achievement differences
• Quasi-experimental comparison of schools/subjects based on a claim of “similarity”
• Quasi-experimental comparison of schools/subjects to
region, state, or national data
• Quasi-experimental single group pre-post comparison
• Quasi-experimental treatment vs. control pre-posttest
• Quasi-experimental multiple group ANOVA
Treatment Category Classification
Modified from Wise, 1996
• Questioning strategies
• Manipulation strategies
• Enhanced materials strategies
• Testing strategies (changed to Assessment strategies)
• Inquiry strategies
• Enhanced context strategies
• Instructional media strategies (changed to Instructional technology strategies)
• Focusing strategies (not used)
• Collaborative learning strategies (added)
Table 1. Frequencies of Characteristics of Included Studies
Independent Variable                        Frequency   Percent (%)
Publication Year
  1980–1984                                      6          9.7
  1985–1989                                      7         11.3
  1990–1994                                      4          6.5
  1995–1999                                     15         24.2
  2000–2004                                     30         48.4
Publication Type
  Refereed Journal Article                      40         64.5
  Dissertation                                  18         29.0
  Unpublished Report                             4          6.5
Type of Study
  Experimental (Complete Randomization)          3          4.8
  Quasi-Experimental (Randomization Used)       33         53.2
  Quasi-Experimental (No Randomization)         26         41.9
  Correlational                                  0          0.0
Table 1 (continued). Frequencies of Characteristics of Included Studies
Independent Variable                        Frequency   Percent (%)
Test Content Area
  Biology                                       17         27.4
  Chemistry                                     12         19.4
  Physics                                        5          8.1
  Earth Science                                  7         11.3
  Science                                       21         33.9
Study Rating
  Experimental, treatment vs. control            2          3.2
  Quasi-Exp. match                               1          1.6
  Quasi-Exp. similar                             1          1.6
  Quasi-Exp. single-group pre-post              14         22.6
  Quasi-Exp. trt vs. control pre-post           27         43.5
  Quasi-Exp. ANOVA                              17         27.4
Totals (for each variable)                      62        100.0
Table 2. Dependent Variable (Test Type)
Test Type                                          Number of Cases   Percent (%)
National Standardized – Multiple Science Content          3              4.8
National Standardized – Single Science Content            6              9.7
Local Standardized – Multiple Science Content             2              3.2
Local Standardized – Single Science Content               4              6.5
Other type test                                          47             75.8
Total                                                    62            100.0
Effect Sizes
• Obtained or calculated for all studies that met criteria
  – n = 62
  – one removed later as an extreme outlier
• Internal & external validity influences on effect sizes calculated
• Regression analysis for moderator variables & dependent variable effect sizes (n = 61)
• Failsafe N calculated for all categories (see the sketch after Table 3)
Table 3. Failsafe N for Total Data and Treatment Description Categories
Data                                      ES      N    Nfs
Overall                                 .6696    61    756
Questioning Strategies                  .7395     3     42
Manipulation Strategies                 .5729     8     84
Enhanced Material Strategies            .2908    12     58
Assessment Strategies                   .5052     2     19
Inquiry Strategies                      .6546    12    145
Enhanced Context Strategies            1.4783     6    172
Instructional Technology Strategies     .4840    15    130
Collaborative Learning Strategies       .9580     3     55
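The deck does not say which failsafe-N formulation was used. Orwin's (1983) formula with an assumed criterion effect size of 0.05 reproduces the tabled Nfs values to within rounding, so a minimal sketch of that variant is shown here; both the formula choice and the 0.05 criterion are assumptions, not statements from the study.

```python
import math

def orwin_failsafe_n(mean_es: float, k: int, criterion_es: float = 0.05) -> int:
    """Number of unretrieved null-effect studies needed to pull the mean
    effect size down to the criterion value (Orwin, 1983)."""
    return math.ceil(k * (mean_es - criterion_es) / criterion_es)

print(orwin_failsafe_n(0.6696, 61))  # 756, matching the Overall row of Table 3
print(orwin_failsafe_n(1.4783, 6))   # 172, matching Enhanced Context Strategies
```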
Analysis of Effect Size
• Comprehensive Meta-Analysis® software
from BioStat
• Outputs
  – Cohen’s d
  – Hedges’s g
  – Q value
  – confidence intervals, etc.
  – fixed and random effects
  – heterogeneity testing results
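The computations were run in Comprehensive Meta-Analysis, not by hand; purely as an illustration of the first two quantities listed above, here is a minimal sketch of Cohen's d and Hedges's g computed from group summary statistics. The group means, SDs, and sample sizes below are invented.

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def hedges_g(d, n_t, n_c):
    """Small-sample bias correction applied to Cohen's d."""
    j = 1 - 3 / (4 * (n_t + n_c) - 9)
    return j * d

# Hypothetical treatment vs. control posttest summaries
d = cohens_d(mean_t=78.0, mean_c=72.0, sd_t=10.0, sd_c=9.0, n_t=30, n_c=32)
print(round(d, 3), round(hedges_g(d, 30, 32), 3))
```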
Figure 1. Mean Effect Sizes for Treatment
Categories and Total Data
[Bar chart: mean effect size (vertical axis, 0–1.6) for each treatment category and for the total data. C1 = Questioning, C2 = Manipulation, C3 = Enhanced Material, C4 = Assessment, C5 = Inquiry, C6 = Enhanced Context, C7 = Instructional Technology, C8 = Collaborative Learning.]
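Only the legend survives from the original chart, so here is a minimal matplotlib sketch that rebuilds an equivalent bar chart from the mean effect sizes reported in Tables 3 and 4.

```python
import matplotlib.pyplot as plt

# Mean effect sizes for C1-C8 plus the overall mean, taken from Tables 3 and 4
labels = ["C1", "C2", "C3", "C4", "C5", "C6", "C7", "C8", "Total"]
mean_es = [0.7395, 0.5729, 0.2908, 0.5052, 0.6546, 1.4783, 0.4840, 0.9580, 0.6696]

plt.bar(labels, mean_es)
plt.ylabel("Mean ES")
plt.ylim(0, 1.6)
plt.title("Mean effect sizes for treatment categories and total data")
plt.show()
```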
Conclusions
What teaching strategies have been shown to improve student achievement in science?
• All of the innovative strategies have a
positive influence on student
achievement.
• Innovative science instruction is a
mixture of teaching strategies.
• Teaching strategies are tools, and the
right tool must be selected for the job
at hand.
Table 4. Ranking of Teaching Strategies
Strategies                              Effect Size   Rank
Enhanced Context Strategies               1.4783        1
Collaborative Learning Strategies          .9580        2
Questioning Strategies                     .7395        3
Inquiry Strategies                         .6546        4
Manipulation Strategies                    .5729        5
Assessment Strategies                      .5052        6
Instructional Technology Strategies        .4840        7
Enhanced Material Strategies               .2908        8
Most Powerful –
Enhanced Context Strategies
• Make learning relevant to students
• Use real-world examples and problems
– Problem-based learning
– Case-based learning
• Use technology to bring real world into
classroom
• Take students out of classroom into real
world
• Use multiple contexts to teach a concept
Future Research – Meta-Analysis
• Examine studies included in the meta-analysis to determine how many of them meet the “strong” or “possible” evidence of effectiveness standards of the U.S. Department of Education’s Institute of Education Sciences (see Identifying and Implementing Educational Practices Supported by Rigorous Evidence, available at http://www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf)
• Broaden scope of meta-analysis to include:
– International studies
– Correlational studies (data on two variables collected and
summarized, showing the relationship between the variables)
– Studies dealing with attitudinal and motivational changes in
students and teachers
– Studies dealing with special populations (English-language
learners, special education, under-represented populations,
etc.)
– Studies dealing with teacher professional development
Products Based on Results of Meta-Analysis
Products
• Research-based Teaching Strategies for Effective Science Instruction
• Rubric for Analyzing Science Products
• Combined in booklet – Effective K-12 Science Instruction: Elements of Research-based Science Education
Rubric Design Based on Meta-Analysis
• Science content
– Accuracy and alignment
– Safety
• Organization and
structure
– Format of materials
– Coherency
• Meaningful assessment
– Alignment
– Formative
– Summative
– Metacognitive
• Effective instructional
practices
– Enhanced context
strategies
– Inquiry strategies
– Instructional technology
strategies
– Collaborative learning
strategies
– Manipulation strategies
– Questioning strategies
• Equity and practicality
– Equity
– Practicality
Rubric Development
• Draft created using criteria
• Sent to advisory board and stakeholders
for comment
• Revision
• Discussion with science teachers/
supervisors
• Further revisions, clarifications, & weighting
of categories
• Field test
• Statistical validation (interrater reliability = .945 using Cronbach’s alpha)
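For context on the reported reliability figure, a minimal sketch of Cronbach's alpha computed with each rater treated as an "item" scoring the same set of products; the rating data below are invented, and the deck does not describe the exact rating design that was used.

```python
import statistics

def cronbach_alpha(scores_by_rater):
    """Cronbach's alpha over a raters-as-items matrix of rubric totals."""
    k = len(scores_by_rater)                               # number of raters
    item_vars = [statistics.variance(r) for r in scores_by_rater]
    totals = [sum(col) for col in zip(*scores_by_rater)]   # per-product sums across raters
    return (k / (k - 1)) * (1 - sum(item_vars) / statistics.variance(totals))

# Hypothetical rubric totals: 3 raters scoring the same 5 science products
raters = [
    [42, 35, 28, 47, 31],
    [40, 36, 30, 45, 33],
    [43, 34, 27, 48, 30],
]
print(round(cronbach_alpha(raters), 3))
```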
Questions or Comments?
Booklets may be ordered for
$1.50 each + shipping
Dr. Carolyn Schroeder
979-458-4450
cschroeder@science.tamu.edu
Texas A&M University
Center for Mathematics and Science
Education
3257 TAMU
College Station, Texas 77843-3257
http://www.science.tamu.edu/cmse/tsi