Changing Distributions: How Online College Classes Alter Student and Professor Performance

Eric Bettinger, Lindsay Fox, Susanna Loeb, and Eric Taylor
Center for Education Policy Analysis, Stanford University
cepa.stanford.edu
Online Higher Education

• As of 2011, more than 3 in 10 US college students take at least one course online.
• Colleges are increasing their online presence by introducing completely online programs.
• These trends are separate from the fascination with, and discussion of, MOOCs.
• The central promise of online education is cost reduction.
• Little evidence exists on the effectiveness of online education or on how it changes the dynamics of traditional learning.
[Figure (Allen and Seaman, 2013): still not much use of MOOCs]

[Figure: continued growth, overall and in many offerings, including complete online programs, since 2002]

Growth in Online Programs

[Figure: growth in online programs]
Why Might Online Results for Students Differ?

• Selection
  – Students who choose online may differ from those who don't
• Format/Technology Effects
  – Access: online at any time, from any place
  – Content delivery
  – Interactions
  – Monitoring
  – Peers (Bettinger, Loeb, and Taylor, 2014)
  – Professors (e.g., Bettinger and Long, 2010; Carrell and West, 2010; Figlio et al., 2013)
Small Research Literature

• Xu and Jaggars (2011, 2013) – negative effects on course completion and grades
  – Propensity score matching for introductory English and math courses in Virginia community colleges (2011) and a distance IV in Washington community colleges (2013): −6.0 percentage points on completion, −0.34 on grades
• Hart, Friedmann, and Hill (2014) – negative effects in California community colleges
  – A series of fixed-effects models, including college-course and individual fixed effects (~8.4 percentage points less likely to complete)
• Streich (2013) – negative effects on passing in community college
  – Student and course fixed effects, and an instrument based on the percent of course seats offered online: students are 7.5–11 percentage points less likely to pass an online class; the IV estimate is a decline of 8.3 percentage points
• Figlio et al. (2013) – online scores are worse for subgroups (males, Hispanics, and lower-achieving students)
  – Random assignment into an economics course at the University of Florida
• Bowen, Chingos, Lack, and Nygren (2014) – no detectable difference between hybrid and traditional courses
  – Random assignment at six public institutions
• Streich (2014) – possibly positive employment effects
  – Compares pre- and post-college employment for older students
  – The benefit is largely in the years immediately following initial enrollment, when students may still be enrolled; earnings fall less during enrolled periods for students who enroll in online classes; a large fixed benefit but an insignificant dosage effect in the long run
For-Profits

• A particularly interesting group
• Large and growing
  – Nearly 2.4 million undergraduate students (full-time equivalent) were enrolled at a for-profit college or university in the 2011-12 academic year
  – 18% of all associate degrees (Deming, Goldin, and Katz, 2012)
• Primarily non-traditional students
  – 30% previously attended college; 70% are starting college for the first time
  – 25% of prior enrollees are "churning" and have attended at least two other institutions before coming to the for-profit sector
  – ~40% of all for-profit college students transfer to another college
  – They look like they are experimenting with for-profits as a destination or to add a course for credit
For-Profits

• Controversial
  – Recruiting practices and use of federal financial aid (GAO, 2010)
  – Gainful Employment regulations require institutions to show that graduates meet strict income-to-debt ratios and loan repayment rates to maintain eligibility (Federal Register, 2010)
• Small literature on effects
  – Cellini and Chaudhary (2012)
    • Students who enroll in associate's degree programs at for-profit colleges see earnings gains between 6 and 8 percent (95% confidence interval from −2.7 to 17.6 percent), not different from public community colleges
    • Students who complete associate's degrees at for-profits earn ~22 percent more, or 11 percent per year; some evidence that this is higher than the returns for public-sector graduates
  – Deming et al. (2014)
    • Resume audit study suggests negative impacts
Some Myths/Facts on Online Education (from Doyle, 2009)

• Prediction 1: Online students will be very different from students taking courses on campuses
  – Not really true
• Prediction 2: Most students who enroll online will do so exclusively
  – No
• Prediction 3: Students will take classes online at distant universities
  – The median online student is only 15 miles from home
• Prediction 4: For-profit higher education institutions will consist primarily of online education
  – At all types of institutions, online enrollment accounts for about 15 percent of the total
This Study

• What's new
  – Compares online with in-person courses (the small existing literature is almost entirely on community colleges)
  – For-profit institution
  – Larger scale, with opportunities for identification
  – Assesses effects on variation in performance as well as on means
    • Standardization vs. monitoring
• Research questions
  1. How do online courses affect students' performance in terms of course completion, grades, and later enrollment?
  2. How do online courses affect the variance of student performance?
  3. How do online courses affect the variance in professor effectiveness?
DeVry University

• One of the largest for-profit institutions:
  – In 2011-12, DeVry enrolled over 100,000 undergraduates and 30,000 graduate students, and employed over 5,500 professors
  – ~5 percent of all for-profit undergraduates
  – Ranks 7th in enrollment among for-profit colleges
  – The largest institution, University of Phoenix, enrolls just over 10 percent, and the 15th largest enrolls 1 percent; collectively, the top 15 enroll 52 percent of students at for-profit colleges
• Began primarily as a technical school in the 1930s, but now most of the University's students major in business management, technology, health, or some combination; 80 percent are seeking a bachelor's degree
  – Geographic representativeness helps with identification
• Approximately 100 local campuses around the United States, where about 1/3 of courses take place; the other 2/3 of courses are conducted online
Sample

• All DeVry University undergraduate course enrollments, May 2009 through November 2013
  – Professors can be identified only for the May 2012 through November 2013 terms
• RQ1 (effects): full sample (n > 281,000 students)
• RQ2 (student variance): only students who took courses both online and in-person
• RQ3 (professor variance): only professors who taught both
• Outcomes
  – We measure student success using course grades
    • Each professor assigns traditional A-F grades, which we convert to the standard 0-4 point equivalents (see the sketch below)
  – We measure student persistence in the subsequent semester and in the semester one year later
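As a concrete illustration of that conversion, here is a minimal Python sketch; the plus/minus point values below are the standard scale we assume, not values confirmed by the study.

```python
# Assumed standard letter-grade-to-points scale (not confirmed by the study).
GRADE_POINTS = {
    "A": 4.0, "A-": 3.7,
    "B+": 3.3, "B": 3.0, "B-": 2.7,
    "C+": 2.3, "C": 2.0, "C-": 1.7,
    "D+": 1.3, "D": 1.0, "D-": 0.7,
    "F": 0.0,
}

def to_grade_points(letter: str) -> float:
    """Convert a traditional A-F letter grade to its 0-4 point equivalent."""
    return GRADE_POINTS[letter.strip().upper()]
```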
Descriptive Statistics

                                        All        Online     In-person
                                        (1)        (2)        (3)
Observations
  Student-course-session                1,947,952  1,163,563  784,389
  Students                              197,158    160,198    99,636
  Professors                            5,287      2,588      3,366
  Courses                               744        555        649
  Sections                              158,458    60,463     97,995
Student characteristics
  Female                                0.464      0.541      0.349
  Age                                   31.238     33.102     28.474
  Online                                0.597      1          0
Student prior academics
  Cumulative GPA                        2.927      2.966      2.870
                                        (0.931)    (0.935)    (0.923)
  Failed any course                     0.279      0.271      0.291
  Withdrew from any course              0.261      0.258      0.264
Student course outcomes
  Grade (0-4)                           2.818      2.801      2.842
                                        (1.316)    (1.343)    (1.275)
  A- or higher                          0.403      0.408      0.394
  B- or higher                          0.693      0.689      0.701
  C- or higher                          0.833      0.825      0.846
  Passed                                0.888      0.880      0.901
Student persistence outcomes
  Enrolled next semester                0.887      0.880      0.897
  Credits attempted next semester       9.756      9.110      10.677
                                        (4.745)    (4.646)    (4.734)
  Enrolled semester one year later      0.708      0.703      0.715
  Credits attempted semester one
    year later                          7.555      6.719      8.750
                                        (5.765)    (5.498)    (5.925)

Standard deviations in parentheses.
Methods – RQ1

• Basic model:

  Y_icbt = β·Online_icbt + f(GPA_i,<t) + X_i·δ + α_c + π_t + ψ_b + ε_icbt

• The outcome for student i in course c at home campus b in term t is a function of taking the course online, prior GPA, observable student characteristics (gender, age), course fixed effects α_c, time fixed effects π_t (27 terms), and home-campus indicators ψ_b.
• We assign each student's home campus based on the physical distance between the student's home address and the local campus addresses.
• Standard errors are clustered at the section level.
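As a minimal illustration, the basic model could be estimated along the following lines with statsmodels. All column names (grade, online, prior_gpa, female, age, course_id, term, home_campus, section_id) are hypothetical stand-ins for the study's data, and f(·) is simplified to a linear prior-GPA term.

```python
import statsmodels.formula.api as smf

def fit_basic_model(df):
    """OLS of course grade on an online indicator, prior GPA, and student
    controls, with course / term / home-campus fixed effects and standard
    errors clustered at the course-section level."""
    model = smf.ols(
        "grade ~ online + prior_gpa + female + age"
        " + C(course_id) + C(term) + C(home_campus)",
        data=df,
    )
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["section_id"]})
```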
Methods – RQ1

• But the choice between online and live is non-random
  – Students who would do better overall may disproportionately choose one format; students who would do better in this particular class in one format may choose that format
• We use choice arising from idiosyncratic factors
  – Distance
  – Local in-person offerings
• Instruments
  – An indicator variable equal to 1 if student i's home campus b offered course c on campus in a traditional class setting during term t
    • Combined with the home-campus fixed effects ψ_b, this limits the identifying variation to between-term, within-campus variation in the availability of an in-person option
  – The distance in miles (Haversine) between student i's home address and the nearest local campus (all students; students within 30 miles)
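A sketch of the distance instrument: the Haversine distance named above, followed by a simplified 2SLS fit with the linearmodels package. The column names (dist_to_campus, section_id, and the rest) are assumptions, and the course and campus fixed effects are omitted here for brevity.

```python
import numpy as np
from linearmodels.iv import IV2SLS

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle (Haversine) distance in miles between points given in
    decimal degrees; accepts scalars or numpy arrays."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * np.arcsin(np.sqrt(a))  # 3958.8 mi = mean Earth radius

def fit_distance_iv(df):
    """2SLS: online instrumented by Haversine distance to the nearest local
    campus, restricted to students living within 30 miles of a campus."""
    near = df[df["dist_to_campus"] <= 30]
    model = IV2SLS.from_formula(
        "grade ~ 1 + prior_gpa + female + age + C(term)"
        " + [online ~ dist_to_campus]",
        data=near,
    )
    return model.fit(cov_type="clustered", clusters=near["section_id"])
```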
Results – RQ1

Panel A. Instrument = course available in person at home campus

                               Course grade            Enrolled next         Enrolled one
                               (A = 4 ... F = 0)       semester              year later
                               (1)        (2)          (3)        (4)        (5)         (6)
Local average treatment
  effect                       -0.230***  -0.214***    -0.027***  -0.043***  -0.031***   -0.066***
                               (0.011)    (0.013)      (0.004)    (0.005)    (0.005)     (0.008)
First stage: coefficient on
  excluded instrument          -0.277     -0.248       -0.264     -0.233     -0.264      -0.233
                               (0.001)    (0.001)      (0.001)    (0.001)    (0.001)     (0.001)
Sample mean (st. dev.)
  of dep. var.                 2.818 (1.316)           0.851                 0.671
Results – RQ1

Panel B. Instrument = distance to home campus (excluding students more than 30 miles away)

                               Course grade            Enrolled next         Enrolled one
                               (A = 4 ... F = 0)       semester              year later
                               (1)        (2)          (3)        (4)        (5)         (6)
Local average treatment
  effect                       0.034      -0.098***    -0.048***  -0.072***  -0.074***   -0.128***
                               (0.030)    (0.023)      (0.007)    (0.006)    (0.010)     (0.010)
First stage: coefficient on
  excluded instrument          0.007      0.010        0.007      0.010      0.007       0.010
                               (0.000)    (0.000)      (0.000)    (0.000)    (0.000)     (0.000)
Sample mean (st. dev.)
  of dep. var.                 2.823 (1.304)           0.857                 0.681
Robustness and Extensions

1. Availability might be predictable: some campuses offer in-person sections in seemingly consistent patterns.
   – Alternative instruments varying availability (see the table and the construction sketch below)
   – Robust to different instruments
2. Distributional effects
   – The grade distribution is discrete
   – Effects appear throughout the distribution and are stronger at the top
   – (With the distance instruments, effects appear stronger at the bottom)
Robustness: IV Estimates Using Alternative Availability Instruments

                        Grade      At least   At least   At least   Passed     Withdrew
                                   A-         B-         C-
Panel B. IV using availability (current session)
Online                  -0.201***  -0.059***  -0.057***  -0.047***  -0.038***  0.024***
(standard error)        (0.008)    (0.003)    (0.003)    (0.002)    (0.002)    (0.002)
Panel C. IV using availability (current, next, and prior sessions)
Online                  -0.230***  -0.071***  -0.067***  -0.052***  -0.040***  0.019***
(standard error)        (0.011)    (0.005)    (0.004)    (0.003)    (0.002)    (0.002)
Panel D. IV using availability (current, next 1 or 2, prior 1 or 2 sessions)
Online                  -0.241***  -0.074***  -0.071***  -0.054***  -0.042***  0.017***
(standard error)        (0.013)    (0.006)    (0.005)    (0.004)    (0.003)    (0.003)
Panel E. IV using availability (current, next 1, 2, or 3, prior 1, 2, or 3 sessions)
Online                  -0.240***  -0.075***  -0.071***  -0.053***  -0.041***  0.016***
(standard error)        (0.014)    (0.006)    (0.005)    (0.004)    (0.003)    (0.003)
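One plausible construction of the multi-session availability instruments in Panels B-E, sketched with pandas. The offerings table and its columns (campus, course, session, in_person) are hypothetical; only the current-session (Panel B) and one-session-window (Panel C) indicators are shown, and the wider windows follow the same shift pattern.

```python
import pandas as pd

def add_availability_instruments(offerings: pd.DataFrame) -> pd.DataFrame:
    """Build availability indicators: was the course offered in person at the
    student's home campus in the current session, or within one session?"""
    df = offerings.sort_values(["campus", "course", "session"]).copy()
    g = df.groupby(["campus", "course"])["in_person"]
    cur = df["in_person"].astype(bool)
    prev1 = g.shift(1).fillna(0).astype(bool)   # prior session, same campus-course
    next1 = g.shift(-1).fillna(0).astype(bool)  # next session, same campus-course
    df["avail_current"] = cur.astype(int)                     # Panel B instrument
    df["avail_window1"] = (cur | prev1 | next1).astype(int)   # Panel C instrument
    return df
```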
Overview of RQ1: Relative Impact of Online Courses on Student Outcomes

• Outcomes are unambiguously worse.
• Grades are lower.
• Future enrollments are less intense and less likely to happen.
• Who are the compliers?

                                                   Mean prior   Proportion   Mean age
                                                   GPA (1)      female (2)   (3)
Compliers: will take the course in-person if
  offered, otherwise will take the course online   2.999        0.441        31.572
Always-takers: always take the course online       2.871        0.542        32.673
Never-takers: always take the course in-person     2.953        0.338        28.074
Methods - RQ2 (variance of student outcomes)

• Again, if students were randomly assigned to take a course online or in-person, the parameter of interest could be estimated directly as the difference between the standard deviations of outcomes in the two formats.
• Here we instead use controls, estimating γ (the online minus in-person difference in the standard deviation of student performance) through an extension of our basic model for RQ1:

  Y_icbt = β·Online_icbt + f(GPA_i,<t) + X_i·δ + α_c + π_t + ψ_b
           + γ_i^online·Online_icbt + γ_i^in-person·(1 − Online_icbt) + ε_icbt

  with the assumption (γ_i^online, γ_i^in-person)′ ~ N(0, diag(σ²_online, σ²_in-person)).

• The γ_i are student random effects, estimated by maximum likelihood; the sample includes only students who take courses both online and in-person.
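A rough sketch of this random-coefficients setup using statsmodels MixedLM, with separate student random effects for online and in-person enrollments. The column names (grade, online, prior_gpa, student_id) are hypothetical, the fixed-effects controls are abbreviated, and MixedLM's likelihood stands in for the paper's exact estimator; the gap between the two estimated random-effect standard deviations is what γ compares.

```python
import statsmodels.api as sm

def fit_student_variance_model(df):
    """Random-coefficients model: separate student random effects for online
    and in-person courses, fit by (restricted) maximum likelihood."""
    # Keep only students observed in both formats
    both = df.groupby("student_id")["online"].transform("nunique") == 2
    sub = df[both].copy()
    sub["in_person"] = 1 - sub["online"]
    model = sm.MixedLM.from_formula(
        "grade ~ online + prior_gpa",
        data=sub,
        groups="student_id",
        re_formula="0 + online + in_person",  # student effect in each format
    )
    return model.fit()  # compare the two estimated random-effect variances
```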
Variance in Student Outcomes

                                        Sample mean,   Online −        P-value,      Student-course
                                        course grade   in-person       test          observations
                                        (1)            diff. (γ) (2)   diff. = 0 (3) (4)
Instrumental variables estimate         2.818          0.296           0.000         1,947,952
Random effects estimate                 2.818          0.106           0.000         1,947,952
Random effects estimate (only students
  who took courses both online and
  in-person)                            2.844          0.282           0.000         1,023,610
Variance in Student Outcomes

                                           Grade       Credits next   Credits next
                                                       semester       year
Panel A. IV model (all students)
S.D. difference, online vs. brick          0.296       0.723          -0.401
  [standard error]                         [0.000]     [0.328]        [0.846]
N (student-by-course)                      1,947,952   1,661,933      1,310,420
Panel B. Random coefficients model (all students)
S.D. difference, online vs. brick          0.106       -0.167         -0.062
  Likelihood ratio test, diff. = 0 [p]     [0.000]     [0.000]        [0.000]
N (student-by-course)                      1,947,952   1,661,933      1,310,420
Panel C. Random coefficients model (only students taking both)
S.D. difference, online vs. brick          0.282       0.085          0.341
  Likelihood ratio test, diff. = 0 [p]     [0.000]     [0.000]        [0.000]
N (student-by-course)                      1,023,610   913,559        752,289
Methods – RQ3 (variance of professor effects)

• Similar approach: replace the student random effects with random effects for professors, μ_j, and for course sections, θ_s:

  Y_ijscbt = β·Online_ijscbt + f(GPA_i,<t) + X_i·δ + α_c + π_t + ψ_b
             + μ_j^online·Online_ijscbt + μ_j^in-person·(1 − Online_ijscbt)
             + θ_s^online·Online_ijscbt + θ_s^in-person·(1 − Online_ijscbt) + ε_ijscbt

  with the assumption that (μ_j^online, μ_j^in-person, θ_s^online, θ_s^in-person)′ is jointly normal with mean zero, variances (σ²_μ,online, σ²_μ,in-person, σ²_θ,online, σ²_θ,in-person), and zero covariance between the professor and section effects.
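In the same spirit, a hedged sketch for the professor/section version: professor random effects by format, with a section variance component nested within professors. Again the column names are assumptions, and this only approximates the model above (for instance, the section effects here are not split by format).

```python
import statsmodels.api as sm

def fit_professor_variance_model(df):
    """Mixed model with professor random effects estimated separately for
    online and in-person classes, plus a section variance component nested
    within professors (not split by format in this simplified sketch)."""
    sub = df.copy()
    sub["in_person"] = 1 - sub["online"]
    model = sm.MixedLM.from_formula(
        "grade ~ online + prior_gpa",
        data=sub,
        groups="professor_id",
        re_formula="0 + online + in_person",          # professor effects by format
        vc_formula={"section": "0 + C(section_id)"},  # section within professor
    )
    return model.fit()
```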
Variance in Professor Effectiveness

                                A. Professors who teach both         B. All professors
                                   in-person and online
                                Course     Enrolled   Enrolled       Course     Enrolled   Enrolled
                                grade      next       one year       grade      next       one year
                                (A=4..F=0) semester   later          (A=4..F=0) semester   later
                                (1)        (2)        (3)            (4)        (5)        (6)
Standard deviation of professor effects
  In-person classes             0.310      0.024      0.044          0.330      0.023      0.044
  Online classes                0.294      0.018      0.030          0.270      0.020      0.019
In-person = online test
  [p-value]                     [0.244]    [0.255]    [0.026]        [0.000]    [0.354]    [0.000]
Student-course observations     116,301    71,723     71,723         470,201    303,390    303,390

Variance in Professor Effectiveness (all outcomes)

                             Grade    At least At least At least Passed   Withdrew Credits  Enrolled Credits  Enrolled
                                      A-       B-       C-                         next     next     next     next
                                                                                   semester semester year     year
Panel A. Professors teaching both in-person and online
Brick (standard deviation)   0.310    0.127    0.102    0.062    0.041    0.025    0.588    0.024    0.563    0.044
Online (standard deviation)  0.294    0.111    0.086    0.064    0.049    0.025    0.371    0.018    0.385    0.030
Likelihood ratio test:
  difference = 0 (p-value)   0.244    0.002    0.001    0.562    0.006    0.988    0.000    0.255    0.021    0.026
N (student-by-course)        116,301  116,301  116,301  116,301  116,301  127,141  61,968   71,723   49,985   71,723
Panel B. All professors
Brick (standard deviation)   0.330    0.131    0.106    0.070    0.049    0.028    0.589    0.023    0.665    0.044
Online (standard deviation)  0.270    0.105    0.082    0.055    0.039    0.025    0.309    0.020    0.346    0.019
Likelihood ratio test:
  difference = 0 (p-value)   0.000    0.000    0.000    0.000    0.000    0.011    0.000    0.354    0.000    0.000
N (student-by-course)        470,201  470,201  470,201  470,201  470,201  519,350  256,391  303,390  203,906  303,390
Heterogeneity in Impacts by Prior Achievement

A. Instrument = course available in person at home campus

                               Course grade            Enrolled next         Enrolled one
Dependent variable             (A = 4 ... F = 0)       semester              year later
                               (1)        (2)          (3)        (4)        (5)         (6)
Online                         -0.229***  -0.214***    -0.027***  -0.043***  -0.031***   -0.066***
                               (0.011)    (0.013)      (0.004)    (0.005)    (0.005)     (0.008)
Online × prior grade point
  avg. (centered at sample
  mean)                        0.091**    0.101***     -0.012     0.001      -0.008      0.002
                               (0.032)    (0.027)      (0.009)    (0.008)    (0.006)     (0.006)
Summary of Key Findings

• Students perform consistently worse in online courses, as measured by both academic performance and retention.
• The variance in student performance is greater online.
• But this is not driven by greater variance in professor performance; there is actually less variance across professors.
• To see professor effects…
Differences Between Professors

[Figure: COLL148, density of professor effects on the sum of quiz scores and final, by professor, in student standard deviations (x-axis -0.6 to 0.6). Mean student score = 0.142; estimated 'value-added' = 0.075; reference lines: AFA Calculus 1 = 0.05, AFA Calculus 2 / K-12 math = 0.14.]

Differences Between Professors

[Figure: PSYC110, density of professor effects on the sum of quiz scores and final, by professor, in student standard deviations (x-axis -0.6 to 0.6). Mean student score = 0.137; estimated 'value-added' = 0.042; reference lines: AFA Calculus 1 = 0.05, AFA Calculus 2 / K-12 math = 0.14.]
Potential Implications

• Student accountability and responsibility might explain the increased variance.
  – Figlio et al. (2013) found that students in online settings were more likely to procrastinate.
  – The design of courses could improve
    • DeVry embedded student incentives to participate in the course prior to our analysis
  – Motivated students likely succeed; procrastinating students drive failure.
• Decreased professor variance can be both good and bad.
  – Great teaching and bad teaching are both stifled.
  – Conformity increases.
  – In other analyses, we can examine what percentage of the time faculty use the exact same language as other faculty – evidence of conformity.
Professors' Consistent Behavior

[Figure: COLL148 discussion threads, professor behavior as predictor. Cumulative fraction logged in on or before each day of the course, days 0-56.]
Conformity in Speech, the Good

[Figure]

Conformity in Speech, the Bad

[Figure]
Conformity

• Context seems to matter
• Some evidence that conformity to "bad" modeling has negative outcomes:
  – One post in one course, frequently copied by other faculty, had significant typos (e.g., an acronym was spelled "Accronmy")
  – Students who were randomly assigned to professors who perpetuated this misspelling were 1% less likely to be enrolled the next semester
  – Some professors corrected themselves, and their pre-correction conformity led to a 3.7% decline in enrollment the next semester
cepa.stanford.edu
• Some failure may be “good”
– Online courses may facilitate new enrollments, and we cannot
distinguish whether the overall increase in new enrollments
compensates for reduced effectiveness.
– Online courses generate some cost savings due to less expensive
faculty and decreased overhead. Costs savings may compensate for
decreased effectiveness.
– Future studies can shed light on this.
CENTER FOR EDUCATION POLICY ANALYSIS at STANFORD UNIVERSITY
Valuation of Failure
cepa.stanford.edu