Conducting Scientifically-Based
Research in Teaching with Technology,
Part I
SITE Annual Meeting Symposium
Atlanta, Georgia
Gerald Knezek & Rhonda Christensen
University of North Texas
Charlotte Owens & Dale Magoun
University of Louisiana at Monroe
March 2, 2004
Our History of
Scientifically-Based Research
• Foundation: More than ten years of
instrumentation development/validation
• Research based on dissertation criteria
• Large data sets analyzed (replication of
findings)
• Quantitative to tell us what is happening;
Qualitative to tell us why it is happening
Components for Evaluation with a
Research Agenda
• Plan for Evaluation (when writing the grant, not after you get it)
• Use reliable/valid instruments and/or
• Work on developing instruments the first
year
• Get baseline data - how can you know how far you have come if you don't know where you started?
• Use comparison groups such as other PT3
grantees
Common Instruments
• Stages of Adoption of Technology
• CBAM Levels of Use
• Technology Proficiency Self
Assessment
• Teachers' Attitudes Toward Computers (TAC)
Online Data Acquisition System
• Provided by UNT
• Unix/Linux based
• Stores data in files
• Data shared with contributors
Why are we gathering this data?
• Campbell, D. T. & Stanley, J. C. (1966).
Experimental and Quasi-Experimental Designs for
Research on Teaching. From Gage, N. L. (Ed.)
Handbook of Research on Teaching. Boston: Rand
McNally, 1963.
Frequently references:
• McCall, W. A. (1923). How to Experiment in
Education.
Adding Research Agendas to
Evaluation
• ‘By experiment we refer to that portion of research
in which variables are manipulated and their
effects upon other variables are observed.’
(Campbell & Stanley, 1963, p. 1)
• Dependent = outcome variable; predicted or measured; we hope this 'depends on' something
• Independent = predictor variable; one manipulated
to make, or believed to make a difference
• Did changing x influence/impact/improve y?
• y = f(x)
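To make the dependent/independent framing concrete, here is a minimal sketch that is not from the presentation: hypothetical pre/post scores where the independent variable x is completion of a course and the dependent variable y is the measured outcome, tested with a paired t-test.

```python
# Minimal sketch (hypothetical data): the independent variable is
# "completed the course" (pre vs. post); the dependent variable is the
# measured outcome y. A paired t-test asks whether changing x moved y.
from scipy import stats

pre  = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3]   # hypothetical pretest scores
post = [4.0, 3.2, 4.4, 3.7, 3.6, 4.1]   # hypothetical posttest scores

t, p = stats.ttest_rel(post, pre)
print(f"paired t = {t:.2f}, p = {p:.4f}")  # did manipulating x change y?
```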
Longitudinal Designs
• PT3/Univ. of North Texas: 1999-2003
– Baseline data year 1
– Pre-post course measures over multiple years
– Trends in exit student survey data
• PT3/University of Nevada/Reno: 2003-2006
– Best features of UNT plus comparisons w/UNT
– Added random selection of 30-60 teachers to track retention through end of induction year
Stages of Adoption of
Technology
Fall 1998
[Bar chart: pretest and posttest Stages of Adoption means for a typical teacher sample (n = 1141) and for students in CECS 1100, CECS 3440, and CECS 4100.]
CECS 4100 Technology Skills
Pre and Post - Spring 1999
[Bar chart: pre-test and post-test self-assessed skill means for E-mail, WWW, Integrated Applications, Technology in Teaching, Multimedia Skills, and Web Skills.]
What is the ‘Experiment’ here?
• Dependent variables: Email, WWW,
Integrated Applications, Teaching with
Technology Competencies
• Independent Variable: completion of content
of course (CECS 4100, Computers in
Education)
Longitudinal Trends in Integration
Abilities
(Research Item)
Stages of Adoption: CECS 4100 (Computers in
Education) Univ. of North Texas
[Bar chart: pre, post, and effect size (ES) for Stages of Adoption across Fall 2001, Spring 2002, Fall 2002, and Spring 2003.]
Stage 1: Awareness
I am aware that technology exists but have not used it - perhaps I'm even avoiding it.
Stage 2: Learning the process
I am currently trying to learn the basics. I am often frustrated using computers. I lack confidence when using computers.
Stage 3: Understanding and application of the process
I am beginning to understand the process of using technology and can think of specific tasks in which it might be useful.
Stage 4: Familiarity and confidence
I am gaining a sense of confidence in using the computer for specific tasks. I am starting to feel comfortable using the computer.
Stage 5: Adaptation to other contexts
I think about the computer as a tool to help me and am no longer concerned about it as technology. I can use it in many applications and as an instructional aid.
Stage 6: Creative application to new contexts
I can apply what I know about technology in the classroom. I am able to use it as an instructional tool and integrate it into the curriculum.
From: Christensen, R. (1997). Effect of technology integration education on the attitudes of teachers and their students. Doctoral dissertation, University of North Texas. Based on Russell, A. L. (1995). Stages in learning new technology. Computers in Education, 25(4), 173-178.
Growth in Technology Integration
Course at Univ. of North Texas
(Typical PT3 Evaluation Item)
CECS 4100 Enrollment Spring '99 - Spring '03
[Bar chart: CECS 4100 enrollment numbers by term (Spring, Summer I, Summer II, Fall) from 1999 through 2003, showing growth in enrollment over the period.]
Data Sharing with PT3 Projects
• Control groups are difficult
• Comparisons within CE clusters are easy!
• Similar trends are positive confirmations for
each other
Spring 2002: Snapshot Data
• Univ. North Texas
• Texas A&M Univ.
• St. Thomas of Miami
• Univ. Nevada at Reno
• Northwestern Oklahoma State Univ.
• Wichita State University (Kansas)
Demographics Spring 2002
• 481 subjects from six schools for pretest
– UNT = 179
– TAMU = 65
– Miami = 14
– Nevada = 91
– Oklahoma = 95
– Wichita St. = 37
• 157 subjects from 3 schools for post test
– UNT, TAMU, St. Thomas (2 times)
Demographics Spring 2002
(cont.)
• Age: Wichita State students are older
– Mean = 28 years
• Gender: UNT & TAMU have more females
– 85% and 97%
• Graduation: UNT, Nevada, Oklahoma
students expect to graduate later
• Teaching Level: TAMU students are predominantly elementary
Educational Technology Preservice
Courses
Fall 2002 Pre and Post - UNT and UF
[Bar chart: Fall 2002 pre and post means for UNT and UF on Stages, CBAM, TP-email, TP-WWW, TP-Integrated Apps, and TP-Teach with Tech.]
Educational Technology
Preservice Courses
Spring 2003 Pre and Post - UNT and UF
[Bar chart: Spring 2003 pre and post means for UNT and UF on Stages, CBAM, TP-email, TP-WWW, TP-Integrated Apps, and TP-Teach with Tech.]
What is the ‘Experiment’ here?
• Dependent Variable: Gains in technology
integration proficiency
• Independent Variables:
– Completion of course content (as before)
– Comparisons/contrasts among different
environments/curricular models (value added)
General Findings
• Reliability of Stages is High
– (r = .88 test-retest)
• Reliability of Skill Self-Efficacy Data is
High
– (Alpha = .77 to .88 for 4 TPSA scales)
• Gender: Females are higher in Web Access,
Home Computer Use, and WWW Skills
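The reliability coefficients cited above are standard statistics. As an illustration only (a minimal sketch on a hypothetical response matrix, not the instruments' actual data), Cronbach's alpha for a set of scale items can be computed like this:

```python
# Rough sketch of Cronbach's alpha for a Likert-style scale
# (hypothetical 5-respondent x 4-item matrix; not project data).
import numpy as np

responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
], dtype=float)

k = responses.shape[1]                         # number of items
item_vars = responses.var(axis=0, ddof=1)      # variance of each item
total_var = responses.sum(axis=1).var(ddof=1)  # variance of respondents' totals
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```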
Spring 2002 Pretest - Six PT3 Sites
[Bar chart: Spring 2002 pretest means for UNT (pre and post), TAMU, St. Thomas, UNReno, Wichita State, and NWOSU on Stages, TP-email, TP-WWW, TP-IA, and TP-TT.]
Pre-Post Trends for TAMU:
Two Teacher Preparation Courses
[Bar chart: pre and post means for two TAMU teacher preparation courses (sections 21 and 22) on CBAM, Stages, Stages2, TPSA-IA, TPSA-Email, TPSA-TT, and TPSA-WWW.]
Impact Across 2 Schools (Pre-Post, UNT & TAMU)
• Stages: ES = .42 to .76
• CBAM LOU: ES = .73 to 1.15
• TPSA-IA: ES = .18 to .82
• TPSA-TT: ES = .33 to 1.12
• TPSA-WWW: ES = .05 to .49
How to Interpret Effect Size
• Cohen's d vs. other effect size measures
• Small (.2), medium (.5), and large (.8)
• Compare to other common effect sizes
– “As a quick rule of thumb, an effect size of 0.30 or greater is
considered to be important in studies of educational
programs.” (NCREL)
– For example, .1 is about one month of learning (NCREL)
– others
SRI International. http://www.ncrel.org/tech/claims/measure.html
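As an illustration only (hypothetical scores, not data from the studies above), Cohen's d for two independent groups can be computed from raw scores using a pooled standard deviation; the small/medium/large thresholds then apply to the result.

```python
# Sketch: Cohen's d for two independent groups (hypothetical scores).
import numpy as np

treatment = np.array([4.2, 4.8, 5.1, 4.5, 4.9, 5.3])
comparison = np.array([4.0, 4.3, 4.1, 4.6, 3.9, 4.4])

n1, n2 = len(treatment), len(comparison)
pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1) +
                     (n2 - 1) * comparison.var(ddof=1)) / (n1 + n2 - 2))
d = (treatment.mean() - comparison.mean()) / pooled_sd
print(f"Cohen's d = {d:.2f}")   # ~.2 small, ~.5 medium, ~.8 large
```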
APA Guidelines for Effect Size
The Publication Manual of the American Psychological
Association (APA, 2001) strongly suggests that effect size
statistics be reported in addition to the usual statistical
tests. To quote from this venerable guide, "For the reader
to fully understand the importance of your findings, it is
almost always necessary to include some index of effect
size or strength of relationship in your Results section"
(APA, 2001, p. 25). This certainly sounds like reasonable
advice, but authors have been reluctant to follow this
advice and include the suggested effect sizes in their
submissions. So, following the lead of several other
journals, effect size statistics are now required for the
primary findings presented in a manuscript.
UNR Collaborative Exchange
New PT3 Project
• Univ. of Nevada - Reno is the lead, with IITTL at UNT as outside evaluator
• One component - following teachers after they graduate from the teacher ed. program
• Randomly select from a pool of 2004
graduates and contact them prior to
graduation; pay a stipend to continue in the
project by providing yearly data
Procedure for Unbiased Selection
• Locate prospective graduates to be certified to
teach during spring 2004
• Number consecutively
• Use random number table to select a preservice candidate from the list (see the sketch below)
• Verify student completed technology integration
course with B or better
• Invite preservice candidate to participate during
induction year and possibly beyond
• Repeat process until 60 agree to participate
From Edwards, A. L. (1954). Statistical Methods for the
Behavioral Sciences. NY: Rinehart.
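A minimal sketch of this selection step, not the project's actual code: the candidate names, pool size, and agreement rate below are placeholders, Python's random module stands in for the printed random number table, and the target of 60 participants comes from the procedure above.

```python
# Sketch of unbiased selection: number the prospective graduates
# consecutively, then draw at random until 60 agree to participate.
import random

def eligible_and_agrees(candidate: str) -> bool:
    """Placeholder for the real checks: B or better in the technology
    integration course, and agreement to provide yearly data."""
    return random.random() < 0.5        # hypothetical 50% agreement rate

prospective_graduates = [f"candidate_{i:03d}" for i in range(1, 251)]  # hypothetical pool

random.seed(2004)                       # fixed seed so the draw is reproducible
order = random.sample(range(len(prospective_graduates)), len(prospective_graduates))

selected = []
for idx in order:                       # random order replaces the random number table
    candidate = prospective_graduates[idx]
    if eligible_and_agrees(candidate):
        selected.append(candidate)
    if len(selected) == 60:             # repeat until 60 agree to participate
        break

print(len(selected), "participants selected")
```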
Maine 2003
Maine Learning Technology
Initiative (MLTI)
• 2001-2002 Laptops for all 7th graders
• 2002-2003 Laptops for all 7th and 8th
graders in the whole state of Maine
• Maine Learns is About Curriculum
Interesting Aspects of Research
• Sample or Population (all 17,000 students in
the state)
• Selection of Exploratory Schools (if they wished to participate; one from each region)
• Statistical measures of significance
• Strong reliance on Effect Size
Research Design
• 9 Exploration schools (1 per region)
• Compared with 214 others
• Used 8th grade state-wide achievement
• Examined trend over 3 years in math, science, social studies, and visual/performing arts
• Intervention -
– Extensive teacher preparation
– Laptop and software for every 7th-8th teacher/student
– Some permitted to take home, others not
2003 Findings
• Evaluators’ reports
• Achievement Effect Sizes
• Student self reports on
– Attitudes toward school
– Self Concept
• Serendipitous findings are sometimes the most valuable
– Home Access
– Gender Equity
MEA 2000-2001: Group 1 = 9 Exploration Schools, Group 2 = All Others

Subject    Group   N     Mean      Std. Dev   Effect Size (Cohen's d)
Science    1       9     529.11    3.95        0.05
           2       204   528.90    4.25
SocStud    1       9     531.33    4.39       -0.12
           2       204   531.89    4.54
Math       1       9     527.78    3.87        0.03
           2       204   527.61    5.03
VPArts     1       9     531.00    5.59        0.06
           2       204   530.65    5.52
MEA 2001-2002: Group 1 = 9 Exploration Schools, Group 2 = All Others

Subject    Group   N     Mean      Std. Dev   Effect Size (Cohen's d)
Science    1       9     529.56    3.84        0.44
           2       214   527.67    4.27
SocStud    1       9     529.44    4.36       -0.06
           2       214   529.76    5.20
Math       1       9     527.78    6.61        0.21
           2       214   526.59    5.72
VPArts     1       9     530.33    4.72        0.11
           2       213   529.67    6.10
MEA 2002-2003: Group 1 = 9 Exploration Schools, Group 2 = All Others

Subject    Group   N     Mean      Std. Dev   Effect Size (Cohen's d)
Science    1       9     529.00    3.43        0.22
           2       211   528.03    4.52
SocStud    1       9     531.44    3.32        0.02
           2       211   531.35    5.41
Math       1       9     528.44    3.88        0.22
           2       211   527.37    4.94
VPArts     1       9     531.67    4.50        0.22
           2       211   530.37    6.08
[Line chart: MLTI 9 Project School Scores vs. 200 Other Maine Middle Schools, in Standard Deviation Units - Effect of Maine Learning Technology Initiative, 2000-2003. Effect sizes for Science, Social Studies, Math, and Visual/Performing Arts are plotted for 2000-2001, 2001-2002, and 2002-2003.]
Would Cohen Have Predicted
This Effect?
"Small Effect Size: d = .2. In new areas of research inquiry,
effect sizes are likely to be small (when they are not zero!).
This is because the phenomena under study are typically
not under good experimental or measurement control or
both. When phenomena are studied which cannot be
brought into the laboratory, the influence of uncontrollable
extraneous variables ("noise") makes the size of the effect
small relative to these (makes the 'signal' difficult to
detect).” Cohen, J. (1977), p. 25.
Exploratory - as Illustrated by:
Impact of Computer Access Restricted to
School - Maine 7th Graders June 2003
[Bar chart: effect sizes comparing students with no computer access outside school to students with a take-home laptop and/or other home access, on CAQ Attitude Toward School, CAQ Self Concept, CAQ Email Skill, and CAQ Total Skill.]
Contrast with Louisiana Confidence
Intervals
(Teacher Perceptions of Impact)
[Chart: means with 95% confidence interval bands (upper and lower) for teacher ratings of Listening Skills, Music Interest, Math Skills, Positive Learning, Positive Educational Effect, and Reading Skills. Chart title: Teachers' Perception of Usefulness of ARTS to the Delta for Math and Reading vs. Fostering Interest in Music, Learning, or Education in General.]
Teacher ratings (N = 22): usefulness of ARTS to the Delta for Math and Reading skills vs. fostering Music Interest, a Positive Learning Experience, and a Positive Effect on Education.

Compared with Math Skills (Mean = 2.41, SD = 1.05):
                               Mean   SD     t      p        Significance
Music Interest                 3.09   1.34   1.87   0.0680   not quite significant
Positive Learning Experience   3.05   1.33   1.77   0.0837   not quite significant
Positive Effect on Education   2.95   1.40   1.45   0.1552   not statistically significant

Compared with Reading Skills (Mean = 2.32, SD = 1.00):
                               Mean   SD     t      p        Significance
Music Interest                 3.09   1.34   2.16   0.0365   statistically significant
Positive Learning Experience   3.05   1.33   2.06   0.0459   statistically significant
Positive Effect on Education   2.95   1.40   1.72   0.0932   not quite significant
Math Skills vs. Music Interest

P value and statistical significance:
  The two-tailed P value equals 0.0680.
  By conventional criteria, this difference is considered to be not quite statistically significant.
Confidence interval:
  The mean of Group One minus Group Two equals -0.6800.
  95% confidence interval of this difference: from -1.4125 to 0.0525.
Intermediate values used in calculations:
  t = 1.8735
  df = 42
  standard error of difference = 0.363
Source: GraphPad QuickCalcs. Free Online Calculators for Scientists. Graphpad.com. Retrieved February 27, 2004.
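The same comparison can be checked from the summary statistics alone. A rough sketch using SciPy (not the GraphPad tool cited above), with the means, SDs, and group sizes reported in the table:

```python
# Sketch: independent two-sample t-test from summary statistics
# (Math Skills vs. Music Interest ratings, n = 22 teachers each).
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=2.41, std1=1.05, nobs1=22,   # usefulness for Math Skills
    mean2=3.09, std2=1.34, nobs2=22,   # fostering Music Interest
    equal_var=True,
)
print(f"t = {t:.4f}, two-tailed p = {p:.4f}")  # t ~ -1.87 (sign reflects group order), p ~ 0.068
```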
It’s all About Confidence
As shown in Figure 1, three of the measures' 95% confidence intervals … are roughly 3/4 of a confidence interval band above … that is, no more than 1/4 of the 95% confidence interval range overlaps from the upper to the lower group. Differences in this range are, as a rule of thumb, "meaningful" according to emerging APA guidelines, and roughly comparable to a p = .05 level of significance (Cumming, 2003). The effect size for the combined upper three versus the lower two is approximately [((3.09+3.05+2.95)/3) - ((2.32+2.41)/2)] / ((1.34+1.33+1.40+1.00+1.05)/5) = (3.03 - 2.37) / 1.22 = .66 / 1.22 = .54, considerably larger than the .30 cutoff beyond which technology interventions are considered meaningful (Bialo & Sivin-Kachala, 1996). Teachers rated the ARTS to the Delta class as much more useful for promoting interest in music and creating a positive effect on students' overall education experience than for improving reading and math skills.
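A quick sketch of that arithmetic (the values come straight from the table above; nothing new is introduced):

```python
# Sketch: combined effect size for the upper three vs. lower two measures,
# using the means and SDs reported in the teacher-ratings table.
upper_means = [3.09, 3.05, 2.95]   # Music Interest, Positive Learning, Positive Effect
lower_means = [2.32, 2.41]         # Reading Skills, Math Skills
all_sds     = [1.34, 1.33, 1.40, 1.00, 1.05]

diff = sum(upper_means) / len(upper_means) - sum(lower_means) / len(lower_means)
avg_sd = sum(all_sds) / len(all_sds)
print(f"effect size = {diff / avg_sd:.2f}")   # ~ .54, above the .30 cutoff
```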