Research in Practice: Using Assessment
to Improve Student Outcomes in General
Education Mathematics
Gail Wisan, Ph.D.
University Director of Assessment
Institutional Effectiveness and Analysis
Florida Atlantic University
Presented at the SAIR 2010 Conference
Southern Association for Institutional Research
New Orleans, LA
September 27, 2010
Some Common Faculty Complaints
About Assessment
1. Paper pushing
2. Dusty reports sit on shelf
3. Nobody even reads reports
4. Improves nothing
5. Has no impact
Perspective / point of view:
Evaluation research should drive outcomes assessment because:
 it helps identify what works;
 it provides direct evidence;
 it helps improve educational outcomes.
Overview of Presentation:
Benefits/Learning Outcomes
1. Be able to explain evaluation research;
2. Identify the benefits of evaluation research;
3. Be able to explain the use of experimental and quasi-experimental design evaluation research in education assessment;
4. Be able to apply evaluation research strategies to outcomes assessment at your institution to improve student learning outcomes.
The Problem: Student Learning Outcomes
in Gen. Ed. Mathematics
The Math Faculty Coordinator was interested in
improving learning outcomes in General Education
math courses, which had a high percentage of
D, W, and F grades (comparative data).
 A problem for students, the department and
faculty, and the university.
Improving Outcomes Assessment:
Research in Practice/
Evaluation Research
The Director of the Mathematics General
Education program and the Director of
Assessment worked together on a
quasi-experimental design to compare the
effectiveness of different teaching and
learning strategies.
Outcomes Assessment and
Evaluation Research
Outcomes Assessment, at its most effective,
incorporates the tools and methods of
evaluation research:
1. Outcomes Evaluation Research
2. Field Experiment Research
Outcomes Assessment and
Evaluation Research
Outcomes Evaluation Research assesses
the effects of existing programs,
pedagogies, and educational strategies on
students’ learning, competencies, and skills.
Outcomes Assessment and
Evaluation Research
Field Experiment Research assesses the
effects of new programs, pedagogies, and
educational strategies on students’ learning,
competencies, and skills.
Outcomes Assessment and
Evaluation Research
Outcomes assessment as evaluation research
should facilitate faculty acceptance since it
involves using the tools and methods of
science to improve student learning.
Outcomes Assessment and
Evaluation Research
 Evaluation Research can answer the
question:
How can Assessment Improve Education?
Outcomes Assessment and
Evaluation Research
This presentation describes how evaluation
research assessment is being used to
compare different pedagogies in
mathematics education (Pre-calculus)
to improve student learning
outcomes.
Outcomes Assessment and
Evaluation Research
The Director of the Mathematics General
Education program assigned a mathematics
professor two sections of Pre-Calculus:
 a 3-hour lecture class
 a 2-hour lecture plus 2 hours of hands-on
problem solving in the computer lab
Comparison of Outcomes for Two
Teaching/Learning Strategies
                     2 Hrs Lect./2 Hrs Lab      3 Hrs Lecture
                     Fall ’09 (Instr. “Smith”)  Fall ’09 (Instr. “Smith”)
Number Enrolled      34                         35
Mean Final Grade     2.4                        2.3
% A Grade            15%                        20%
% B Grade            32%                        9%
% C Grade            24%                        31%
% D Grade            0%                         11%
% F Grade            18%                        11%
% W Grade            12%                        14%
Comparison of Outcomes for Two
Teaching/Learning Strategies
                            2 Hr. Lect./2 Hr. Problem-  3 Hr. Lecture
                            Solving Computer Lab
Number Enrolled             34                          35
Mean Final Grade            2.4                         2.3
B or Above Grade            47%                         29%
C or Above Grade            71%                         60%
% D, W, or F Grade          29%                         40%
Next Math Course: Calculus  56%                         43%
Next Math Course: None      21%                         41%
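The difference in D/W/F rates (29% vs. 40%) can be checked for statistical significance. Below is a minimal sketch, not part of the presentation, using a chi-square test; the counts are reconstructed from the slide’s percentages and enrollments, so they are approximate.

```python
# Sketch: chi-square test of D/W/F rate by section type.
# Counts reconstructed from the slide (29% of 34 ~ 10; 40% of 35 = 14).
from scipy.stats import chi2_contingency

#         lect+lab  lecture-only
table = [[10,       14],   # D, W, or F
         [24,       21]]   # C or above

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```

With only 34 and 35 students per section, a gap of this size may not reach conventional significance, which is why replication across additional sections and semesters (as in the Fall 2008 comparison later in the deck) strengthens the evidence.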
Comparison of Inputs for Students in Two
Classes With Different Teaching/Learning
Strategies
                           2 Hrs Lect./2 Hrs Lab       3 Hrs Lecture
                           Fall 2009 (Instr. “Smith”)  Fall 2009 (Instr. “Smith”)
Number Enrolled            34                          35
HS GPA                     3.4                         3.4
% With HS GPA              100%                        91.4%
SAT Math                   563                         552
SAT Verbal                 521                         529
ACT Math                   23                          24
ALEX Math Placement Score  53.7                        54.6
Has Math Placement Score   94%                         89%
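Baseline equivalence between nonequivalent groups can be checked from summary statistics like these. The sketch below (not from the presentation) compares SAT Math means with a Welch t-test; the standard deviations are hypothetical placeholders, since the slide reports only means and enrollments.

```python
# Sketch: baseline equivalence check on SAT Math for the two sections.
# Means and ns are from the slide; the SDs (80) are assumed, not reported.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=563, std1=80, nobs1=34,  # lect+lab section
                            mean2=552, std2=80, nobs2=35,  # lecture-only section
                            equal_var=False)               # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```

If the groups differed meaningfully at baseline, an adjusted comparison (see the ANCOVA sketch later in the deck) would be safer than comparing raw outcomes.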
Comparison of Outcomes for Two
Teaching/Learning Strategies
(Additional Control Group- 2008)
                  2 Hrs Lect./2 Hrs  3 Hrs Lecture     3 Hrs Lecture     All
                  Lab Fall ’09       Fall ’09          Fall ’08
                  (Instr. “Smith”)   (Instr. “Smith”)  (Instr. “Smith”)
Number Enrolled   34                 35                109               178
Mean Final Grade  2.4                2.3               1.9               2.1
% A Grade         15%                20%               11%               13%
% B Grade         32%                9%                9%                13%
% C Grade         24%                31%               18%               22%
% D Grade         0%                 11%               17%               12%
% F Grade         18%                11%               16%               15%
Comparison of Outcomes for Two
Teaching/Learning Strategies
(Additional Control Group- 2008)
                  2 Hrs Lect./2 Hrs  3 Hrs Lecture     3 Hrs Lecture     All
                  Lab Fall ’09       Fall ’09          Fall ’08
                  (Instr. “Smith”)   (Instr. “Smith”)  (Instr. “Smith”)
Number Enrolled   34                 35                109               178
Mean Final Grade  2.4                2.3               1.9               2.1
B or Above Grade  47%                29%               20%               27%
C or Above Grade  71%                60%               39%               49%
D, F, or W Grade  29%                40%               59%               49%
Comparison of Inputs for Students in Two Classes With
Different Teaching/Learning Strategies
(Additional Control Group)
                  2 Hrs Lect./2 Hrs  3 Hrs Lecture     3 Hrs Lecture     All
                  Lab Fall ’09       Fall ’09          Fall ’08
                  (Instr. “Smith”)   (Instr. “Smith”)  (Instr. “Smith”)
Number Enrolled   34                 35                109               178
Mean Final Grade  2.4                2.3               1.9               2.1
HS GPA            3.4                3.4               3.1               3.2
% With HS GPA     100%               91.4%             94%               95%
SAT Math          563                552               532               542
SAT Verbal        521                529               510               516
Comparison of Inputs for Students in Two Classes With
Different Teaching/Learning Strategies
(Additional Control Group)
                           2 Hrs Lect./2 Hrs  3 Hrs Lecture     3 Hrs Lecture     All
                           Lab Fall ’09       Fall ’09          Fall ’08
                           (Instr. “Smith”)   (Instr. “Smith”)  (Instr. “Smith”)
Number Enrolled            34                 35                109               178
Mean Final Grade           2.4                2.3               1.9               2.1
SAT Math                   563                552               532               542
ACT Math                   23                 24                22                23
ALEX Math Placement Score  53.7               54.6              54.4              54.3
Has Math Placement Score   94%                89%               84%               87%
Research Design
Examples: Overview
Notation: X, O, R
Experimental Design
Pre-Experimental Design and its problems in
educational research
1. Threats to internal validity (Is X really having an effect?)
2. Threats to external validity (generalizability)
Research Design Examples:
Quasi-Experimental Designs Versus
Pre-Experimental Designs
Quasi-Experimental Designs: Better Answers
1. Better solutions to internal validity threats (Is X really having an effect?)
2. Better solutions to external validity threats (generalizability)
Notation on Diagrams
An X will represent the exposure of a group to an
experimental variable or teaching method, the
effects of which are to be measured.
O will refer to observation or measurement.
R refers to a random assignment.
Research Design
How Quasi-experimental Design helps to
solve the problems of Pre-experimental
Design
Experimental Designs
Pretest-Posttest Control Group Design:
Random assignment to two groups
R  O  X  O
R  O     O
Experimental Designs
Pretest-Posttest Control Group Design
R  O  X  O
R  O     O
Sources of Invalidity
External
Interaction of Testing and X
Interaction of Selection and X ?
Reactive Arrangements ?
Experimental Designs
Posttest-Only Control Group Design
R  X  O
R     O
Experimental Designs
Posttest-Only Control Group Design
R  X  O
R     O
Sources of Invalidity
External
Interaction of Selection and X ?
Reactive Arrangements ?
Pre-Experimental Designs
One-Shot Case Study
X O
Sources of Invalidity
Internal
History
Maturation
Selection
Mortality
External
Interaction of Selection and X
Pre-Experimental Designs
One-Group Pretest-Posttest Design
O X O
Sources of Invalidity
Internal
History
Maturation
Testing
Instrumentation
Interaction of Selection and Maturation, etc.
Regression ?
External
Interaction of Testing and X
Interaction of Selection and X
Reactive Arrangements ?
Pre-Experimental Designs
Static-Group Comparison
X  O
   O
Sources of Invalidity
Internal
Selection
Mortality
Interaction of Selection and Maturation, etc.
Maturation ?
External
Interaction of Selection and X
Threats to Internal
Validity
History, the specific events occurring
between the first and second measurement
in addition to the experimental variable.
Maturation, processes within the
respondents operating as a function of the
passage of time per se (not specific to the
particular events), including growing older,
growing hungrier, growing more tired, etc.
Testing, the effects of taking a test upon the
scores of a second testing.
Threats to Internal
Validity
Instrumentation, in which changes in the
calibration of a measuring instrument or
changes in the observers or scorers used,
may produce changes in the obtained
measurements.
Regression. This operates where groups
have been selected on the basis of their
extreme scores.
Threats to External
Validity
Interaction of Testing and X. A pretest might
increase/decrease the respondent’s
sensitivity or responsiveness to the
experimental variable, making the results
obtained for a pretested population
unrepresentative for the unpretested
universe from which the respondents were
selected.
Interaction of Selection and X
Threats to External
Validity
Reactive Arrangements. This would preclude
generalization about the effect of the
experimental variable upon persons being
exposed to it in nonexperimental settings.
Multiple-X Interference. This is likely to occur
whenever multiple treatments are applied to
the same respondents, because the effects
of prior treatments are not usually erasable.
Threats to Internal
Validity
Selection. There could be biases resulting in
differential selection of respondents for the
comparison groups.
Mortality. This refers to differential loss of
respondents from the comparison groups.
Interaction of Selection and Maturation, etc.,
which in certain of the multiple-group
quasi-experimental designs might be mistaken for
the effect of the experimental variable.
Quasi-Experimental
Designs:
Nonequivalent Control
Group Design
O  X  O
O     O
Quasi-Experimental
Designs:
Nonequivalent Control
Group Design: Comparing
Math Classes Example
O  X  O
O     O
Quasi-Experimental
Designs
Nonequivalent Control Group Design
O  X  O
O     O
Sources of Invalidity
Internal
Interaction of Selection and Maturation, etc
Regression ?
External
Interaction of Testing and X
Interaction of Selection and X ?
Reactive Arrangements ?
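For this design, one common analysis strategy (not described in the slides) is to adjust the posttest comparison for pretest differences with an ANCOVA-style regression. Below is a minimal sketch with made-up data; the variable names (pretest, posttest, lab) are illustrative only.

```python
# Sketch: ANCOVA-style adjustment for the nonequivalent control group design.
# All data are hypothetical; 'lab' = 1 for the lecture+lab section, 0 otherwise.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "pretest":  [55, 60, 48, 70, 52, 65, 58, 62],
    "posttest": [68, 75, 60, 85, 58, 72, 66, 70],
    "lab":      [1, 1, 1, 1, 0, 0, 0, 0],
})

model = smf.ols("posttest ~ pretest + lab", data=df).fit()
print(model.params["lab"])  # lab effect, adjusted for pretest differences
```

Adjusting for the pretest can reduce the Interaction of Selection and Maturation threat listed above, since group differences already present at the pretest are less likely to masquerade as treatment effects.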
Comparing Math Strategies: First
Observation/First Test Pre-Calculus
                          Lecture 2 Hrs./Hands-On   Lecture 3 Hrs.
                          Computer Lab 2 Hrs.
Mean Grade                58.97                     59.44
Median Grade              59                        60.5
Lowest Grade              25                        15
Highest Grade             100                       90
Confidence Level (95.0%)  6.37                      6.59
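The “Confidence Level (95.0%)” rows appear to be half-widths of 95% confidence intervals for the mean (the label Excel’s Descriptive Statistics tool uses). Here is a minimal sketch of that computation from raw scores; the scores below are hypothetical placeholders, not the actual class data.

```python
# Sketch: 95% confidence half-width for a mean first-test grade.
import numpy as np
from scipy import stats

scores = np.array([25, 45, 59, 60, 62, 71, 80, 100])  # hypothetical grades
half_width = stats.t.ppf(0.975, df=len(scores) - 1) * stats.sem(scores)
print(f"mean = {scores.mean():.2f} +/- {half_width:.2f} (95% CI)")
```

Because the two intervals overlap heavily (58.97 ± 6.37 vs. 59.44 ± 6.59), the first test suggests the sections started out essentially even, which strengthens the later outcome comparison.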
Examples of Other Quasi-Experimental Designs
Time Series
O  O  O  O  X  O  O  O  O
Multiple Time Series
O  O  O  O  X  O  O  O  O
O  O  O  O     O  O  O  O
Quasi-Experimental
Designs
Time Series
O  O  O  O  X  O  O  O  O
Sources of Invalidity
Internal
History
Instrumentation ?
External
Interaction of Testing and X
Interaction of Selection and X ?
Reactive Arrangements ?
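An interrupted time series is usually analyzed by regressing the outcome on time plus a post-intervention indicator and testing for a level shift. A minimal sketch, not from the presentation, with hypothetical data:

```python
# Sketch: interrupted time series (O O O O X O O O O) via segmented regression.
# All data are hypothetical; imagine a per-term % C-or-above pass rate.
import numpy as np
import statsmodels.api as sm

pass_rate = np.array([58, 60, 57, 61, 68, 70, 69, 71])
time = np.arange(len(pass_rate))
post = (time >= 4).astype(int)  # X introduced before the 5th observation

X = sm.add_constant(np.column_stack([time, post]))
fit = sm.OLS(pass_rate, X).fit()
print(fit.params)  # intercept, time trend, level shift at the intervention
```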
U.S. Dep’t. of Ed Focuses on
Level of Evidence
U.S. Department of Education highlights
“What Works” in educational strategies;
“What works” is based upon assessment of
level of evidence provided by educational
research: evaluation research
Dep’t. of Education Evaluates
Evidence
General Education & Learning Outcomes Assessment:
The National Context
At the National Symposium on Student Success,
Secretary of Education Margaret Spellings and
others called on colleges to measure and provide
evidence of student learning.
“Measuring Up” national report cards by state:
little data on whether students are learning
Outcomes assessment has two purposes
• Accountability (standardized national tests?)
• Assessment/Effectiveness
—Are Students Learning? How much?
Performing Assessment as
Research in Practice
Assessment should seek systematic evidence
of the effectiveness of existing programs,
pedagogies, methodologies and approaches
to improve student learning outcomes and
instill a cycle of continuous improvement.
Implementation Strategy: Aim for Quasi-Experimental Designs (or Experimental Designs)
Revitalizing Assessment:
Consider these Next Steps
1. Academic Leadership needed –
Work with Academic Coordinators and
Chairs interested in improving
outcomes
2. Encourage Academic Action
Research: Outcomes evaluation
research and field experiments to
compare Learning Outcomes for
different pedagogies
Revitalizing Assessment:
Consider these Next Steps
3. Encourage comparing teaching
strategies when faculty are teaching
more than one section of the same
course
4. Provide analytic support for academic
coordinators, faculty, and departments
engaged in outcomes evaluation research
Revitalizing Assessment:
Consider these Next Steps
5. Encourage enthusiasm and
excitement (e.g., faculty mini-grants,
recognition)
6. Communicate and Use Results
Acknowledgements:
Dr. Roger Goldwyn, Director of the Math General
Education Program, Florida Atlantic University, Boca
Raton, FL
Dr. Kevin Doherty, Database Administrator, Institutional
Effectiveness and Analysis, Florida Atlantic University,
Boca Raton, FL
QUESTIONS?
Please email
gwisan@fau.edu