Online Homework and Student Performance:
Evidence from a Natural Experiment
Karen Menard**
Bridget O’Shaughnessy*
Abigail Payne*
With Olesya Vovk* and Katherine Swierzewski*
May, 2014
Preliminary Draft
Do not cite
Abstract:
Recent research has shown that online homework in economics courses can provide a small but
statistically significant improvement in student achievement. During the fall 2010 semester, it
was brought to the attention of instructors at McMaster that the provincial government's policies
regarding ancillary fees at universities prohibit the mandatory use of online resources linked to
students' grades. In Introductory Macroeconomics, the enforcement of the policy had a dramatic
impact on how the course was delivered. Prior to fall 2010 online homework, delivered through a
website called "Aplia", comprised twenty percent of students' grades. Access to the website came
free with the purchase of a new textbook. Students who did not wish to purchase a new hard
copy book could purchase access to the Aplia website which included an electronic copy of the
text. For the winter 2011 semester, the online homework component was dropped, and students
were evaluated using only mid-term tests and a final exam. During the 2011-2012 academic year,
students had the option of doing online homework for grades, or just having their grade made up
of the tests and exam. The enforcement of the provincial policy regarding the use of online tools
provides a natural experiment that can be used to study the effectiveness of such tools - over
5000 students from across campus are involved in the study. Preliminary results indicate that the
online homework improves student learning outcomes, particularly for weaker students.
*Department of Economics and Public Economics Data Analysis Laboratory (PEDAL),
McMaster University
** Ontario Institute for Cancer Research
1. Introduction
With class sizes ever increasing, and teaching resources constrained, instructors would
like to use high quality online resources to assess students. The current study will address
the following question: are online resources effective as teaching and learning tools? As
explained in more detail below, we study this question by taking advantage of the natural
experiment that occurred when the provincial policy around the use of online resources was
strictly enforced in 2011.
From 2008 to 2012, all sections of a large enrollment economics course during the
regular school year (September-April) were taught by the same instructor, using the same
textbook and largely the same evaluation methods. Until the Fall 2010 term, students wrote
two term tests and a final exam and did mandatory online homework, which was provided by
a publishing company for a fee. The fee included an online version of the textbook at a
lower price than purchasing a new edition of the paper book. During the fall 2010 term, it
was brought to the attention of the instructor that the provincial policy regarding ancillary
fees prohibits instructors from mandating that students purchase access to such websites for
graded work. As a result of this new information, the online homework portion of the
grading scheme was dropped for the winter 2011 term. After consultation with
administrators, it was determined that the use of such online learning tools was acceptable as
long as they were optional to students. Beginning in fall 2011, students could purchase
access to the online homework and use it for grades. Students who did not choose to
purchase the online homework package had the weight transferred to their final exam. A
breakdown of the grading scheme for each term in the study is given in Table 1.
The two term tests were not weighted equally – the test with the higher score was given
more weight in the calculation of a student’s final grade. The exact weights are given in
parentheses in Table 1. The online homework included a math test designed to review basic math
concepts taught in high school, weekly homework assignments based on lectures and
textbook readings, and, when the homework was mandatory, assignments based on online
experiments. During the experiments, hundreds of students met online in a simulated market.
Each student was given a role as either a buyer or seller, or a lender or borrower, and placed
bids through their internet browser. When the computer found a match between two parties,
the transaction was recorded. Online discussions followed several rounds of trading in the
market. For each of the two experiments in 2010, students were required to complete two
homework assignments, one prior to the experiment and one after the experiment was
completed. When the online homework system became optional, the experiments were
dropped from the course and the only online homework assignments were based on the
lectures and textbook readings.
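
To make the asymmetric test weighting concrete, the following small Python sketch shows how a Winter 2010 final grade would be assembled under the weights in Table 1 (25% on the better term test, 15% on the other, 40% final exam, 5% math test, 15% online homework). The function and the example inputs are ours, for exposition only, and not the instructor's actual grading code.

    def final_grade_winter_2010(test1, test2, exam, math_test, homework):
        """Illustrative Winter 2010 scheme: the higher-scoring term test
        receives the 25% weight, the lower one 15% (all inputs 0-100)."""
        better, worse = max(test1, test2), min(test1, test2)
        return (0.25 * better + 0.15 * worse + 0.40 * exam
                + 0.05 * math_test + 0.15 * homework)

    # Example: 70 and 60 on the term tests, 65 on the exam,
    # 90 on the math test, 85 on the homework.
    print(final_grade_winter_2010(70, 60, 65, 90, 85))  # 69.75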
In our analysis, we will examine student performance in the course overall as well as
their performance on individual tests and the exam. Table 2 shows the timing of the
homework assignments relative to the tests and final exam for the two terms with online
homework. The remainder of the paper is structured as follows: Section 2 presents a
literature review; Section 3 describes the methods used and the development of the data set;
Section 4 presents our results; and Section 5 concludes.
2. Literature Review
There are two bodies of literature that directly inform this study. The first relates to
student retention. Finnie and Qiu (2008) find that first year students are more likely to leave
university than students who make it into their second year of studies. We know that
students who have a connection to faculty tend to perform better than students who do not.
Most first year students face large classes and we also know that it is difficult to get this
connection to faculty in classes of 400 to 600 students (Nagda et al., 1998; Chickering and
Gamson, 1987). Finnie and Qiu also report that the most common reason students leave
university is “didn’t like it/not for me”. Chickering and Gamson emphasize time on task as a
core principle in good teaching. With very large classes, offering traditional paper
homework assignments, graded by the instructor or by teaching assistants, is impractical.
Instructors are using online homework both to engage students and to provide students with
more time on task, so gathering evidence of its efficacy is an important area of work.
Bosshardt (2004) studies the decision of first year economics students to drop the course.
Based on student characteristics, he divides the class into groups that are “good” students or
“at-risk” students, those with a high probability of receiving a grade of D or F. Some of the
at-risk students dropped the class while others did not. Bosshardt finds that of the at-risk
students who dropped the course, only 16% remained at the university they were attending.
One particularly interesting aspect of our project will be whether there are differential
impacts with online homework – does the online homework best serve good students or at-risk
students? Does the answer depend on whether the homework is required or optional?
Another body of work that is relevant pertains to the effectiveness of online homework in
economics. In a survey of 700 students conducted by Kennelly and Duffy (2007), students
overwhelmingly favoured online homework compared with traditional paper homework, felt
that using the publishers’ online homework tool helped improve their understanding of the
course material, and wanted their instructors to use the website in the future.
Nguyen and Trimarchi (2010) study two different publishers’ online homework tools,
finding that online homework had a small but statistically significant impact on student
grades. Use of such websites led to a 2% increase in grades whether or not the online
homework was required. The experiments Nguyen and Trimarchi (2010) conducted involved
several sections of the course taught by the same instructor, some sections having online
homework and others not having online homework. In one experiment, no variables were
included to control for ability or other student characteristics. In the other experiment
information about student characteristics was obtained via a student survey. Our study uses
administrative data and information from students’ university applications to determine
student characteristics, which are likely to be more reliable than survey responses.
Additionally, we are including three distinct cases in our study – one with mandatory online
homework, one with optional online homework, and one with no online component at all.
Lee, Courtney, and Balassi (2010) compare online homework with traditional paper
homework graded by a teaching assistant. The core measure of student learning is the
difference between pre- and post-test scores on the Test of Understanding in College
Economics (TUCE), a commonly used measure in the economic education literature. They
find that differing homework methods do not affect students’ improvement in TUCE scores,
despite anecdotal evidence that students prefer online homework. One drawback of the study
is the small sample size – they consider three sections of a microeconomics principles course
with enrollments of between 46 and 77. When we are considering large classes with
approximately 1,500 students in each of the three semesters of analysis, the choice is often
not between paper-based homework and online homework, but between online homework
and no homework at all. Clearly, this ties in with Chickering and Gamson’s emphasis on
time on task.
3. Methodology and Dataset Development
3.1 Natural Experiment Framework
To understand better the effectiveness of online homework on student performance in
large enrollment classes, we utilize a natural experiment methodology. As explained above,
prior to the winter term of 2011, the online homework was mandatory in a large enrollment
economics course taught by a single instructor (five sections over an academic year, 2500
students across these sections). For the winter term of 2011, no online homework was
offered. Starting in the fall term of 2011, online homework was offered but was an optional
component. Thus, we can study two natural experiments. Experiment #1 is the comparison
of a control group that does not have any online homework (winter 2011) with a treatment
group that has required online homework (winter 2010). Experiment #2 is the comparison of
a treatment group of students that are offered optional online homework (winter 2012) with
two control groups: those with no online homework (winter 2011) and those with required
online homework (winter 2010). Note that we are only using students enrolled in the winter
terms as we have observed slight differences in student background characteristics across the
fall and winter terms. Thus the comparison of the no online homework, mandatory online
homework, and voluntary online homework regimes is best done only on winter term
students.
In an ideal setup, we would observe the same student under both the control and the
treatment conditions. When it comes to matters such as education and other social science
based experiments, however, it is rare to accomplish this. Instead we must compare similar
students that are exposed to different treatments. This requires a level of confidence that the
students exposed to the different treatments are similar in all ways except for the instrument
being tested (the use of online homework). Fortunately, in our setup, the students are
exposed to the same instructor, the same lectures, the same textbook, and the same test
bank of multiple choice questions. The students may differ in terms of program of study
(e.g. engineering, commerce, economics, social science) and they may differ in terms of the
level of preparation. We have developed a rich data set that will help us to control for these
differences in our multivariate analysis.
3.2 Data
We have constructed the data set by linking measures from four main sources of data.
The primary source was the course data collected by the instructor.
The data are collected for all students who registered to take the course in three winter terms:
winter 2010, 2011 and 2012. We have developed measures to reflect the marks received by each
student on the two term tests and the final exam, as well as marks on each of the 14 homework
assignments and the math evaluation test for the terms where homework was offered. Additional
measures were created to capture the weight of each test and each homework assignment
used to calculate the final course grade. Both the term tests and the final exam consisted of
multiple choice questions and short answer questions. To ensure consistency, we are using
only the multiple choice sections from the term tests and final exam for all terms under study.
The second data source was obtained from the university’s Office of the Registrar and
includes information on other courses taken by the students. From that source, we created
measures to capture overall student performance in the fall term preceding the course as well
as the winter term of the course. We calculated the number of credits taken (full or partial
load), average GPA in each term, and flagged any other economics courses taken by the
student, as well as previous attempts at the course under study. Other measures include the
level of enrolment (level 1, level 2, etc.), and the program of study in which the student is registered
during the term in question (e.g. engineering, commerce, social science, humanities). The
Registrar’s data also provide us with information on students who withdrew from the
university entirely but were listed in the course data as “failed the course”.
The third data source was information on the applications submitted by the students for
admission to the university through the Ontario University Application Centre. The data
records include applications for students applying directly from an Ontario high school
(known as “101” students) and delayed entry and/or non-Ontario high school students
(known as “105” students). The 101 set of applications captures information on the students’
performance in level 4 (grade 12) courses in high school and their home postal code. For the
105 applicants, high school grade information was more limited; the location of their
residence and home postal code (if from Canada), however, were available. Using
the high school course information, we calculated the average of the best 6 grade 12 U-level
courses, which is equivalent to the general admission requirements for the university.
The fourth source of the data was the 2006 Canadian Census. Using the home postal code
measures, we added the information on socio-economic characteristics of students’
neighbourhoods such as average household income, share of immigrants, age and education
profiles. The census geography utilized was the dissemination area, a geography that covers
roughly 500 households.
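
The linkage just described can be illustrated with a short, hedged sketch in Python (pandas); the file and column names (applications.csv, home_postal_code, da_id, avg_household_income) are hypothetical stand-ins for the actual confidential files.

    import pandas as pd

    # Hypothetical files standing in for the confidential linked sources.
    students = pd.read_csv("applications.csv")    # includes home_postal_code
    pc_to_da = pd.read_csv("postal_to_da.csv")    # postal code -> dissemination area
    census = pd.read_csv("census_2006_da.csv")    # DA-level 2006 census measures

    linked = (
        students.merge(pc_to_da, on="home_postal_code", how="left")
                .merge(census, on="da_id", how="left")
    )

    # Neighbourhood income terciles (bottom/middle/top) from average household
    # income, as used in the summary statistics and regressions below.
    linked["income_tercile"] = pd.qcut(
        linked["avg_household_income"], q=3, labels=["bottom", "middle", "top"]
    )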
Table 3 captures basic information on the students who took the economics course in
2010, 2011, or 2012 and compares them with entry level students at the university in these
years as well as entry level students across all universities in Ontario. Compared to the
university as a whole, the economics course contains a higher share of male students, higher
shares of commerce and engineering students, and higher shares of immigrant students.
There are lower shares of science/health and humanities students. Overall, however, the
students share similar high school GPAs and other characteristics.
In Table 4 we report summary statistics that compare students enrolled in the three terms
the macroeconomics course was offered. Many of the demographic characteristics of students
in the three terms are comparable. About the same proportions of students attended 6 or more
years in a Canadian school system, speak English as their first language, applied for OSAP,
and come from low-, middle-, and high-income neighbourhoods. Performance in high school
is also similar across students registered in all three terms. The average of the students’ best
six grade 12 courses is virtually identical, at about 86%, and almost all 101 students had at
least one grade 12 math course. Another strikingly similar characteristic is the proportion of
students carrying a full time course load during the terms under study, 83-84% in each term.
There are a few moderate differences across terms. In 2010 the proportion of males taking
the economics course was lower while the proportion of Canadian citizens was higher. A
smaller proportion of the class was comprised of business and engineering students in 2010
than in the other terms. This is likely due to scheduling issues, and it seems plausible that
business and engineering students took the course in the fall 2009 term rather than the winter
term in 2010. The largest differences can be seen in the students’ performance on the online
math test (for the two semesters the online homework was available) and in the percentage of
the class enrolled in level 1. The average in the online math test was 92% in 2010 and 82%
in 2012. Incentives may matter – in 2010 the math test was worth 5% of the students’ final
grade, while in 2012 it was only worth 3%. Level 1 students make up a higher proportion of
the class in 2012 than either 2010 or 2011 (80%, 68%, and 72% respectively). Overall, the
students appear quite similar in all three of the terms under study.
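
As a hedged illustration of the balance comparisons behind Table 4, a two-sample test of mean differences across terms might look as follows in Python; the data file and column names are hypothetical.

    import pandas as pd
    from scipy import stats

    df = pd.read_csv("winter_terms_linked.csv")  # hypothetical linked data set

    # Compare Winter 2010 and Winter 2011 students on a background measure,
    # e.g. the best-6 high school average, to check that the groups are similar.
    g10 = df.loc[df["term"] == "W2010", "hs_avg"].dropna()
    g11 = df.loc[df["term"] == "W2011", "hs_avg"].dropna()

    t_stat, p_val = stats.ttest_ind(g10, g11, equal_var=False)  # Welch's t-test
    print(f"Difference in means: {g10.mean() - g11.mean():.2f} (p = {p_val:.3f})")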
3.3 Empirical Model
Our focus is on student performance on the multiple choice section of two term tests and
the final exam. Student i’s test score, T_i, depends on individual student information obtained
from course records, the Registrar, and the student’s university application, X_i, and on
information about the dissemination area in which the student resided when applying to
university, N_i:

(1)   $T_i = \beta_0 + \beta_1 H_i + X_i \beta_X + N_i \beta_N + \varepsilon_i$
Individual characteristics include a dummy variable for online homework, H_i, faculty,
gender, level at the university (first year or higher), English as a second language, high
school average grade (best six courses), OSAP, years in Canadian school system, age,
citizenship, and a dummy variable for ‘105’ students. Neighbourhood characteristics include
measures of income and educational attainment, visible minority percentage, percentage of
one-parent families, youth unemployment, population, and proportion of population between
the ages of 15 and 24. Robust standard errors are clustered at the DA level; clustering is
appropriate when members of a group (here, a census dissemination area of roughly 500
households) are likely to share characteristics that are unobservable.
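
A minimal sketch of how equation (1) can be estimated with DA-clustered standard errors, using Python's statsmodels; the data file and variable names (test_score, online_hw, hs_avg, da_id, and the controls shown) are hypothetical stand-ins for the measures described above, and the control list is abbreviated.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("winter_terms_linked.csv")  # hypothetical linked data set

    # test_score: multiple choice score on a term test or the final exam (T_i)
    # online_hw:  1 if the student was in the online-homework term (H_i)
    # remaining terms: abbreviated individual (X_i) and neighbourhood (N_i) controls
    model = smf.ols(
        "test_score ~ online_hw + female + upper_year + hs_avg"
        " + pct_ba_plus + C(income_tercile)",
        data=df,
    )

    # Cluster-robust standard errors at the dissemination area level, since
    # students from the same DA likely share unobservable characteristics.
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["da_id"]})
    print(result.summary())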
4. Results
4.1 Mandatory Homework Compared with No Homework
Figure 1 shows the average score on the multiple choice section of the two term tests and the
final exam for the year in which online homework was mandatory and the year without online
homework. Mean scores in the no-homework term are slightly lower on the term tests and
dramatically lower on the final exam. When online homework was mandatory, the mean on the final
exam MC section was 67.8%, compared with 55.2% in the term with no homework. Distributions of the test and
exam grades are given in Figure 2. Again, the difference in term test grade distributions is
not large. For the final exam, the shapes of the distributions are strikingly similar, but the
entire distribution is shifted to the left in the term with no online homework.
Regression results are provided in Tables 5A and 5B. Our key variable of interest, a
dummy variable equal to one for students with online homework, is statistically significant
for both term tests and the final exam. A student’s grade on each term test is likely to be
about two percentage points higher in the term with mandatory homework, while the final
exam grade is over twelve percentage points higher. Faculty of enrolment has a statistically
significant impact on test performance in most cases. Relative to Social Sciences students,
Business and Science/Health Science students perform better on both tests and the final
exam. The only significant difference for Engineering students is on the final exam,
outperforming Social Sciences students by over three percentage points. Humanities students
fare worse on all three assessments, but the difference on the first test is not statistically
significant.
Any grade difference of more than three percentage points is meaningful to the students
in the sample. The university under study uses a twelve point grading system to calculate
cumulative averages, where an “F” equals zero, “D-” equals one, “D” equals two, up to “A+”
equalling twelve. For grades in the “D”, “C”, and “B” ranges, an increase in grade of three
percentage points is enough to move students up to the next grade level. For example, a
student with a grade of 67% has a C+, which translates to six out of twelve, while a student
with 70% has a B-, which is seven out of twelve.
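
The conversion in this example can be written down explicitly. The percentage bands below are an assumption on our part: only the C+ and B- bands are confirmed by the example in the text (67% maps to 6, 70% maps to 7), and the rest follow common Ontario twelve-point schemes; the university's official table may differ.

    # Assumed percentage cut-offs for the twelve-point scale; only the C+ and
    # B- bands (67 -> 6, 70 -> 7) are confirmed by the example in the text.
    GRADE_BANDS = [
        (90, 12), (85, 11), (80, 10),  # A+, A, A-
        (77, 9), (73, 8), (70, 7),     # B+, B, B-
        (67, 6), (63, 5), (60, 4),     # C+, C, C-
        (57, 3), (53, 2), (50, 1),     # D+, D, D-
    ]

    def twelve_point(percentage):
        for cutoff, points in GRADE_BANDS:
            if percentage >= cutoff:
                return points
        return 0  # F

    print(twelve_point(67), twelve_point(70))  # 6 7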
Gender, high school grades, and being an upper year student all have statistically
significant and meaningful impacts on test and exam performance. The largest of these
coefficients is on “upper year”, with upper year students scoring over five percentage points
better on the term tests and almost three percentage points better on the final exam. Females
perform between one and two percentage points worse than males, and a one percentage
point increase in high school average leads to about a one percentage point increase in test
and exam performance. Census measures are all numerically small and insignificant, with the
exception of income. Students in lower income neighbourhoods have slightly lower scores on
both term tests and the final exam. However, only two coefficients are statistically
significant: the final exam for the lowest tercile and the first term test for the middle tercile.
4.2 Mandatory Homework Compared with Optional Homework
Figures 3 and 4 show the test/exam means and grade distributions for mandatory
homework and optional homework. These figures look strikingly like Figures 1 and 2, where
we compared mandatory homework with no homework. Performance on the first term test
and final exam are lower in the year with optional homework, with the final exam mean
dropping by more than eleven percentage points when homework is optional. The mean
scores on the second term test are identical in the two years.
Optional homework leads to a four percentage point decrease on Test #1 and an almost
twelve percentage point decrease on the final exam compared with mandatory homework.
Performance on test 2 is not affected by the optional nature of the online homework. A
student’s faculty of enrollment affects test/exam performance, with Business, Engineering,
and Science/Health students outperforming Social Science students on all assessments by 1.5
to 5.6 percentage points. All are statistically significant except for Engineering students’
performance on the second term test.
Being an upper year student, male, a native English speaker, and having a higher high
school average all contribute to higher scores on the tests and final exam. Living in a low
income neighbourhood negatively impacts test/exam performance. None of the other census
variables are statistically significant or meaningful in terms of coefficient size.
5. Conclusions
Does online homework improve student performance on term tests and exams? When the
homework is mandatory, the answer is yes. Students with mandatory online homework did
better on term tests by two to four percentage points than students with no homework or with
optional homework. Final exam results were even more striking – mandatory online
homework leads to an eleven to twelve percentage point increase in the final exam score,
again compared with no homework or optional homework. These results were obtained using
large samples of over 2500 students, controlling for many individual and neighbourhood
characteristics.
The next step in this research is to compare the performance of students who chose to do
the homework with those who did not during the term when homework was optional. We
also need to examine the relationship between homework and test performance for weaker
students compared with stronger students. We will conduct propensity score matching
analysis to further explore the relationship between online homework and test/exam
performance.
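
The propensity score matching analysis is left for future work, but a minimal sketch of the standard approach (a logit propensity model followed by nearest-neighbour matching) might look as follows; all file and variable names are placeholders, and the covariate list is abbreviated.

    import pandas as pd
    import statsmodels.formula.api as smf
    from sklearn.neighbors import NearestNeighbors

    df = pd.read_csv("winter_terms_linked.csv")  # hypothetical linked data set

    # 1. Model the propensity to be in the online-homework term.
    ps = smf.logit("online_hw ~ female + upper_year + hs_avg", data=df).fit()
    df["pscore"] = ps.predict(df)

    # 2. Match each treated student to the control student with the
    #    closest propensity score.
    treated = df[df["online_hw"] == 1]
    control = df[df["online_hw"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched = control.iloc[idx.ravel()]

    # 3. Average treatment effect on the treated: mean outcome difference.
    att = treated["exam_score"].mean() - matched["exam_score"].mean()
    print(f"ATT on final exam score: {att:.2f} percentage points")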
References

Bosshardt, W. (2004) "Student drops and failure in principles courses", Journal of Economic
Education, 35(2), 111-28.

Chickering, A., Gamson, Z. (1987) "Seven principles for good practice in undergraduate
education", American Association of Higher Education Bulletin, 39(7), 3-7.

Finnie, R., Qiu, H. (2008) "The patterns of persistence in post-secondary education in Canada:
evidence from the YITS-B dataset", accessed 19 March 2014 at
http://higheredstrategy.com/mesa/pub/pdf/MESA_Finnie_Qiu_2008Aug12.pdf.

Kennelly, B., Duffy, D. (2007) "Using Aplia software to teach principles of economics", paper
presented to the Developments in Economic Education Conference, Cambridge, UK, 6-7
September.

Lee, W., Courtney, R., Balassi, S. (2010) "Do online homework tools improve student results in
principles of microeconomics courses?", American Economic Review, 100(2), 283-86.

Nagda, B. A., Gregerman, S. R., Jonides, J., von Hippel, W., Lerner, J. S. (1998)
"Undergraduate student-faculty research partnerships affect student retention", The Review
of Higher Education, 22(1), 55-72. Retrieved from
http://www-personal.umich.edu/~jjonides/pdf/1998_6.pdf

Nguyen, T., Trimarchi, A. (2010) "Active learning in introductory economics: do MyEconLab
and Aplia make any difference?", International Journal for the Scholarship of Teaching and
Learning, 4(1), 10.
Table 1: Grading Scheme by Term (term test weights in parentheses: better test, other test)

                     Winter 2010        Winter 2011        Winter 2012
Term Tests           40% (25%, 15%)     50% (30%, 20%)     45% (25%, 20%)
Final Exam           40%                50%                40%
Online Math Test     5%                 --                 3%
Online Homework      15%                --                 12%
Table 2: Course Schedule, Winter 2010 and Winter 2012

Winter, 2010 (weeks 2-14, Jan. 15 - Apr. 14; Reading Week Feb. 15-19):
  Homework: online math test (week 2); assignments on Chapters 2-5, 7, 10, 11, and 13-15;
  practice assignments on Chapters 6, 9, and 12; Experiment #1 preparation and analysis
  assignments (late January); Experiment #2 preparation and analysis assignments (late March).
  Tests: Test #1 (Chapters 1-4); Test #2 (Chapters 5-10); Final Exam (Apr. 14).

Winter, 2012 (weeks 2-14, Jan. 13 - Apr. 7; Reading Week Feb. 20-24):
  Homework: assignments on Chapters 2-15; online math test (week 3).
  Tests: Test #1 (Chapters 1-6); Test #2 (Chapters 7-11); Final Exam (Apr. 7).
Table 3: Sample comparison with the student population

                                          Sample      All students at    All Ontario
                                          students    university under   students
                                                      study (1)          (2)
Total students in sample                  4,295       18,505             255,633
Delayed or non-Ontario (% 105 students)   11.0%       --                 --
Gender (% male)                           63.4%       46.7%              43.9%
Immigrant status
  Canadian Citizen                        77.8%       88.4%              91.1%
  Permanent Resident                      11.0%       7.3%               5.7%
  Other                                   11.2%       4.3%               3.1%
Years in Canadian K-12 school system
  6 years or more                         76.0%       88.3%              90.8%
  3-5 years                               13.4%       7.2%               5.1%
  2 years or less                         10.6%       4.5%               4.1%
% with English reported as their
  primary language                        65.6%       75.8%              76.9%
Indication on application that an
  application for financial aid was
  submitted (% applied)                   61.7%       67.6%              65.4%
Students with ON postal code (3)          4,019       18,427             254,933
% living in low income neighbourhood      15.3%       15.3%              20.3%
High School GPA (Best 6 University or Mixed Courses)
  Average Best 6 GPA (s.d.)               86.2 (5.1)  86.1 (5.5)         83.3 (6.5)
  5th percentile of best 6                78.2%       76.8%              72.5%
  Median best 6                           85.7%       86.0%              83.5%
  95th percentile of best 6               95.5%       95.5%              93.8%
Students by Program of Registration, Year 1 (4)
  Commerce (includes Social Sciences)     26.7%       11.9%              14.1%
  Engineering                             28.2%       19.1%              9.0%
  Science/Health                          24.5%       33.2%              28.4%
  Humanities                              13.5%       30.8%              40.0%
  Other                                   7.2%        5.0%               8.5%

(Note: Differences of group means are statistically different from zero at the 1% confidence
level across all groups.)

NOTES:
(1) All students who applied directly from HS in 2010, 2011 and 2012 and were registered at the
university under study.
(2) All students who applied directly from HS in 2010, 2011 and 2012 and were registered at any
university.
(3) OUAC 101 students (direct applicants) may have a non-Ontario postal code if they attend an
Ontario-approved international high school. For schools located in Ontario, if the student
postal code was invalid, the school postal code was used.
(4) Program of registration is based on transcript data for column 1 and application data for
columns 2 and 3.
Table 4A: Demographic Characteristics of Sample Students: Comparison Across Terms

                                          Absence of     Presence of Online Homework
                                          Online HW
                                          Winter 2011    Winter 2010    Winter 2012
                                          (1)            (2)            (3)
Total students per term                   1,432          1,417          1,454
Gender (% male)                           64.73%         59.14%         66.23%
Immigrant Status
  Canadian Citizen                        76.40%         80.17%         76.82%
  Permanent Resident                      11.45%         10.87%         10.80%
  Other                                   12.15%         8.96%          12.38%
Years in Canadian school system (at admission)
  6 years or more                         73.95%         77.77%         76.07%
  3-5 years                               15.08%         12.35%         12.93%
  <= 2 years                              10.96%         9.88%          11.00%
% with English reported as their
  primary language                        64.53%         66.97%         65.27%
Applied for OSAP (% applied)              60.68%         62.74%         61.62%
Neighbourhood of residence (if residing in Ontario at admission)
  Students with ON postal code            1,319          1,351          1,356
  % low income (bottom tercile)           20.28%         20.95%         18.66%
  % mid income (middle tercile)           23.25%         23.59%         21.50%
  % high income (top tercile)             56.46%         55.46%         59.84%
Median distance to the University, km     50.5           47.7           46.9
Table 4B: High School Information of Sample Students: Comparison Across Terms

                                          Winter 2011    Winter 2010    Winter 2012
High School GPA (Best 6 University or Mixed Courses)
  Average Best 6 (s.d.)                   86.1 (5.0)     86.0 (5.3)     86.4 (4.9)
  Avg. grade difference with Winter 2010  0.1 (0.2)      --             0.4 (0.2)
  Avg. grade difference with Winter 2011  --             --             0.2 (0.2)
  Median Best 6                           85.7           85.3           85.8
High School Math (average of all grade 12 University stream courses)
  # students with at least one Math       1,221          1,259          1,263
    (% of 101 students)                   98.2%          97.5%          97.8%
    (% of all students)                   85.3%          88.9%          86.9%
  % taking level 12 Math courses (101 students)
    0 math courses                        1.85%          2.48%          2.24%
    1 math course                         7.07%          10.53%         5.19%
    2 math courses                        62.22%         57.78%         61.46%
    3 math courses                        28.54%         27.27%         30.80%
    4 or 5 math courses (a)               0.32%          1.94%          0.31%
  Average of Best Math grade (s.d.)       87.0 (8.2)     86.6 (8.4)     87.6 (7.9)
  Median Best Math grade                  89             88             89

(a) Only 5 students in the entire sample took 5 grade 12 math courses, all of whom were in the
economics course in 2010.
Table 4C: University Information of Sample Students: Comparison Across Terms

                                          Winter 2011    Winter 2010    Winter 2012
Students by Enrolled Faculty at Time of Course
  Business                                27.65%         23.78%         28.27%
  Social Sciences                         12.08%         13.62%         11.83%
  Engineering                             27.72%         24.84%         31.64%
  Science/Health                          29.12%         34.65%         24.97%
  Humanities                              2.72%          2.40%          2.89%
  (N/A)                                   0.63%          0.64%          0.41%
% students registered in Level 1          71.79%         68.38%         79.78%
Concurrent or Past Enrollment in First Year Microeconomics Course
  Number of students                      1,215          1,159          1,305
  Mean performance in microeconomics
    course, scale of 12 (s.d.)            7.8 (2.9)      8.5 (2.6)      7.5 (3.0)
Performance in Course Online Math
  Test (s.d.)                             --             92.0 (14.5)    81.9 (31.3)
Overall Term Performance (total # credits taken in the term)
  Average # of credits (s.d.)             13.7 (3.2)     13.9 (3.5)     13.8 (3.4)
  Minimum / Median / Maximum credits      3 / 15 / 21    3 / 15 / 21    3 / 15 / 22
  Share of students with "full time
    loads"                                83.98%         83.22%         83.80%
Figure 1: Average Term Test and Exam Scores Based on Online Participation (percentage on test).
Winter 2010 (mandatory online homework): Term Test One 78.8; Term Test Two 74.2; Final Exam 67.8.
Winter 2011 (no online homework): Term Test One 76.7; Term Test Two 71.8; Final Exam 55.2.
Figure 2A: Distribution of Midterm Grades by Term (fraction of students by grade, 0-100):
Term Test One and Term Test Two, Winter 2010 vs. Winter 2011.
Figure 2B: Distribution of Final Exam Grades by Term (fraction of students by grade, 0-100):
Winter 2010 vs. Winter 2011.
Table 5A
Winter 2010 (Mandatory Online Homework) and Winter 2011 (No Online Homework) Midterm and
Final Exam Regressions

Dependent variable: percentage on multiple choice section.

VARIABLES                                 (1) Test 1        (2) Test 2        (3) Final Exam
Online Homework Mandatory                 2.106 (0.484)**   2.342 (0.475)**   12.613 (0.393)**
Transcript Measures
  Business                                2.626 (0.891)**   3.571 (0.947)**   1.670 (0.747)*
  Engineering                             0.934 (0.921)     1.622 (0.985)     3.358 (0.758)**
  Science/Health                          4.011 (0.999)**   4.158 (1.083)**   5.643 (0.873)**
  Humanities                              -1.102 (1.793)    -3.541 (1.729)*   -3.275 (1.340)*
OUAC Measures
  Upper Year Student                      5.810 (0.645)**   5.694 (0.706)**   2.750 (0.593)**
  Distance from University                0.006 (0.003)*    0.004 (0.003)     0.006 (0.002)**
  Female                                  -1.659 (0.499)**  -1.346 (0.529)*   -1.974 (0.424)**
  Non-English First Language              -0.634 (0.628)    -0.679 (0.616)    0.232 (0.516)
  Average Best 6 High School Grade        1.013 (0.056)**   1.048 (0.055)**   1.130 (0.047)**
  OSAP Applicant                          -0.186 (0.506)    -0.101 (0.519)    -0.139 (0.409)
  Years in Canadian School System         0.120 (0.106)     0.156 (0.084)     0.208 (0.076)**
  Age at time of Course (year taking
    course - year of birth + 1)           0.211 (0.193)     0.202 (0.201)     0.013 (0.186)
  Canadian Citizen                        0.132 (1.038)     -1.515 (0.970)    -0.270 (0.839)
  OUAC 105 Students                       0.724 (0.291)*    0.032 (0.303)     -0.083 (0.261)

Clustered standard errors in parentheses. ** p<0.01, * p<0.05
Table 5B
Winter 2010 (Mandatory Online Homework) and Winter 2011 (No Online Homework) Midterm and
Final Exam Regressions (continued)

VARIABLES                                 (1) Test 1        (2) Test 2        (3) Final Exam
Census Measures (DA Level)
  % Pop. Visible Minority                 0.012 (0.011)     0.012 (0.012)     0.010 (0.009)
  % Pop. 1 Parent Family                  -0.004 (0.030)    -0.002 (0.030)    0.026 (0.025)
  % Pop. with High School Diploma         0.067 (0.049)     0.058 (0.050)     0.070 (0.042)
  % Pop. with Trade Certificate           0.034 (0.077)     -0.008 (0.083)    0.021 (0.064)
  % Pop. with College Certificate or
    Diploma                               -0.046 (0.049)    -0.013 (0.054)    -0.006 (0.043)
  % Pop. with University Certificate or
    Diploma                               0.031 (0.083)     0.024 (0.085)     0.031 (0.071)
  % Pop. with BA or Higher                0.015 (0.033)     0.025 (0.034)     0.077 (0.027)**
  Bottom Third DA Income Tercile          -1.415 (0.773)    -1.602 (0.835)    -1.479 (0.641)*
  Middle Third DA Income Tercile          -1.370 (0.641)*   -0.708 (0.644)    -0.733 (0.516)
  % Unemployed age 15-24                  -0.064 (0.072)    0.000 (0.078)     0.034 (0.064)
  Total Population in Thousands           -0.067 (0.113)    -0.089 (0.101)    -0.008 (0.094)
  % Population age 15-24                  0.079 (0.059)     0.024 (0.066)     0.086 (0.049)
  Constant                                -94.779 (28.670)** -32.011 (30.031) -45.009 (25.505)
Observations                              2,577             2,540             2,602
R-squared                                 0.304             0.289             0.516

Clustered standard errors in parentheses. ** p<0.01, * p<0.05
Figure 3: Average Test Scores Based on Online Participation (percentage on test).
Winter 2010 (mandatory online homework): Term Test 1 78.8; Term Test 2 74.2; Final Exam 67.8.
Winter 2012 (optional online homework): Term Test 1 74.4; Term Test 2 74.2; Final Exam 56.4.
Figure 4A: Distribution of Midterm Grades by Term (fraction of students by grade, 0-100):
Term Test One and Term Test Two, Winter 2010 vs. Winter 2012.
Figure 4B: Distribution of Final Exam Grades by Term (fraction of students by grade, 0-100):
Winter 2010 vs. Winter 2012.
Table 6A
Winter 2010 (Mandatory Online Homework) and Winter 2012 (Optional Online Homework) Midterm
and Final Exam Regressions

Dependent variable: percentage on multiple choice section.

VARIABLES                                 (1) Test 1        (2) Test 2        (3) Final Exam
Online Homework Optional                  -4.079 (0.429)**  0.307 (0.473)     -11.065 (0.394)**
Transcript Measures
  Business                                3.190 (0.813)**   2.139 (0.943)*    3.049 (0.758)**
  Engineering                             2.114 (0.809)**   1.578 (0.938)     2.517 (0.769)**
  Science/Health                          5.626 (0.863)**   4.662 (1.015)**   5.635 (0.861)**
  Humanities                              0.793 (1.467)     -2.398 (1.909)    -2.400 (1.379)
OUAC Measures
  Upper Year Student                      4.108 (0.618)**   3.140 (0.679)**   4.044 (0.610)**
  Distance from University                0.004 (0.003)     0.007 (0.003)**   0.008 (0.002)**
  Female                                  -1.695 (0.468)**  -0.756 (0.506)    -1.951 (0.441)**
  Non-English First Language              -1.109 (0.562)*   -2.041 (0.638)**  -1.281 (0.528)*
  Average Best 6 High School Grade        0.940 (0.047)**   1.052 (0.053)**   1.146 (0.046)**
  OSAP Applicant                          0.207 (0.452)     -0.633 (0.515)    -0.177 (0.412)
  Years in Canadian School System         0.040 (0.100)     0.029 (0.110)     0.053 (0.094)
  Age at time of Course (year taking
    course - year of birth + 1)           -0.002 (0.215)    0.088 (0.231)     0.037 (0.209)
  Canadian Citizen                        1.106 (0.963)     -0.333 (1.151)    1.015 (0.934)
  OUAC 105 Students                       0.484 (0.306)     -0.068 (0.352)    -0.003 (0.297)

Clustered standard errors in parentheses. ** p<0.01, * p<0.05
Table 6B
Winter 2010 (Mandatory Online Homework) and Winter 2012 (Optional Online Homework) Midterm
and Final Exam Regressions (continued)

VARIABLES                                 (1) Test 1        (2) Test 2        (3) Final Exam
Census Measures (DA Level)
  % Pop. Visible Minority                 0.015 (0.010)     0.030 (0.011)**   0.012 (0.009)
  % Pop. 1 Parent Family                  0.015 (0.029)     0.018 (0.030)     0.042 (0.025)
  % Pop. with High School Diploma         0.029 (0.048)     0.091 (0.050)     0.044 (0.042)
  % Pop. with Trade Certificate           0.075 (0.070)     0.016 (0.075)     -0.048 (0.061)
  % Pop. with College Certificate or
    Diploma                               -0.020 (0.046)    -0.038 (0.051)    -0.042 (0.040)
  % Pop. with University Certificate or
    Diploma                               -0.028 (0.075)    -0.105 (0.083)    -0.128 (0.070)
  % Pop. with BA or Higher                0.036 (0.031)     0.021 (0.034)     0.041 (0.027)
  Bottom Third DA Income Tercile          -1.568 (0.802)    -1.966 (0.845)*   -1.462 (0.651)*
  Middle Third DA Income Tercile          -0.643 (0.589)    -0.706 (0.641)    -0.084 (0.516)
  % Unemployed age 15-24                  -0.065 (0.069)    -0.075 (0.082)    -0.078 (0.064)
  Total Population in Thousands           -0.115 (0.110)    -0.046 (0.121)    -0.137 (0.099)
  % Population age 15-24                  0.109 (0.063)     0.030 (0.067)     0.078 (0.049)
  Constant                                -59.196 (30.045)* -16.348 (34.456)  -38.033 (28.905)
Observations                              2,640             2,523             2,649
R-squared                                 0.313             0.266             0.493

Clustered standard errors in parentheses. ** p<0.01, * p<0.05
The Effectiveness of Tutorials in Large Classes:
Do they matter? Is there a difference between traditional and collaborative learning tutorials?
Karen Menard**
Bridget O’Shaughnessy*
A. Abigail Payne*
With Olesya Kotlyachkov* and Bradley Minaker*
June 2014
*Department of Economics and Public Economics Data Analysis Laboratory (PEDAL),
McMaster University
**Ontario Institute for Cancer Research
I. Introduction
Across much of Ontario and Canada, the typical first-year university student experience
involves enrollment in large classes. While large classes offer universities many financial and
resource benefits, their impact on students, particularly weaker students, tends to be
overlooked. Struggling students may not seek help and/or may disengage from their studies.
Ultimately this may lead to poor decisions about the program of study and academic choices
made in later years and, in many instances, to dropping out.
As illustrated in Dooley, Payne, and Robb (2011), high school grades are a strong indicator of
success in university and first-year students are more likely to leave university than students who
make it into their second year of studies (see also Finnie and Qiu, 2008).
In this report we examine the use of tutorials in a large enrollment course and analyze the
benefits of tutorials in such courses. It has been demonstrated that students who develop a
connection with faculty tend to perform better than students who do not develop such a
connection. Studies have shown that it is difficult to develop a student/faculty connection in
classes of 400 to 600 students (see Nagda, 1998, Chickering and Gamson, 1987, and Finnie and
Qiu, 2008). If students are engaged in their first year, this may have a positive impact on their
university progression. However, given the reality of resource constraints on universities and the
new practice of large classes, how can students be appropriately engaged in the context of a large
class? Admittedly, poor performance in any one class is unlikely to cause a student to leave
post-secondary education (PSE), and good performance in any one class is equally unlikely to
entice a student to stay in PSE. It seems reasonable to believe, however, that helping a student
improve performance in one class could mitigate other factors that might cause the student to
leave university. If an approach were successful in improving at-risk student performance in one
course, it could possibly be replicated in many first year courses and have an impact on
persistence.
If tutorials in large enrollment courses are successful at engaging students, does it matter
whether the tutorial is conducted as a traditional tutorial (i.e., with a teaching assistant working
through a problem with the students) or as a collaborative learning tutorial (i.e., groups of
students working through a problem together, with the teaching assistant providing assistance)?
Collaborative learning has been extensively studied in science and engineering
programs. Felder (1995) and Felder, Felder, and Dietz (1998) found that collaborative learning
and active learning improved student outcomes and student satisfaction in a sequence of large
(90 to 123 students) chemical engineering courses. The instructional techniques used in the
Felder studies, however, did not seem to help the weakest students in the class. A meta-analysis
of 39 studies in science, mathematics, engineering, and technology courses showed a positive
and statistically significant impact on student achievement, motivation, and attitudes when
cooperative learning or collaborative learning methods were used (see, Springer, Stanne, and
Donovan, 1999).
Two studies have examined collaborative learning in the social sciences, but for relatively
smaller classes than those studied here. Yamarik (2007) studied class sizes of 25-35
students and focused on students in second/third year of university. He found collaborative
learning classes to be more effective in leading toward student success than traditional classes.
Huynh, Jacho-Chaves, and Self (2010 and 2011) studied classes of 200 students in first-year
studies. They found benefits to collaborative learning but lacked an experimental design that
allowed them to compare the benefits to other methods such as traditional tutorials. A key
finding of Huynh, Jacho-Chaves and Self (2011) is that collaborative learning had a particularly
strong positive impact on students falling in the bottom 40th percentile.
In this report we analyze an intervention conducted during the 2012/2013 academic
school year for a large class in economics. This course typically enrolls over 2,500 students each
year across five sections. The students in this course represent all faculties across the campus as
the course is a pre-requisite for many programs. The backgrounds of students and their level of
preparation for the course vary. Previous attempts at low-cost interventions to improve student
performance in this course were ineffective. For example, in 2009/2010, a random set of
students who performed poorly on the first test were personally emailed by the instructor and
provided information on academic resources. Final course performance by these students was no
different than students who were not emailed but performed similarly on the first test.
The innovation in our study is to examine the effectiveness of collaborative learning in
larger classes (approximately 500-600 students per class) and to compare and contrast the
effectiveness of collaborative learning versus traditional tutorials. In previous studies with
smaller class sizes, most of the students participated in the tutorials. With very large classes,
students may feel anonymous and attendance at tutorials may be lower than in a small class
setting. As many departments face increasingly tight budgets, there is a trend towards
eliminating tutorials, partly due to a perceived lack of student participation. Our study will begin
to establish participation rates and identify which type of tutorial methods may be more
beneficial in engaging students in large enrollment courses.
Overall we find a high proportion of students participate in the first tutorial. We also find
that close to 70% of the students attend at least 3 tutorials but that less than half the students
attend all five tutorials. We find a measurable impact of tutorial attendance on course exams and
on the final grade. Students who participated in all five tutorials performed better than those who
only attended three tutorials. The traditional tutorials have a stronger (positive) effect on course
performance. There is, however, a stronger positive correlation between the collaborative
learning tutorials and performance on optional online homework assignments.
The following section of the report outlines the course and experimental design. In
section III we discuss the data, the selection of students studied and present summary statistics.
Finally in sections IV and V we present the analysis and the discussion, respectively.
II. Experiment Design
Our experiment focused on instruction of introductory macroeconomics at an Ontario
university. This course was taught by a single instructor and followed a similar structure for
seven years. Each year, five sections of the course were offered: two in the fall (September to
December) and three in the winter (January to April). The enrollment in each section ranged
between 400 and 600 students, with a total enrollment of approximately 2,400 students each
academic year. Students taking introductory macroeconomics at the university under study are
registered across all of the faculties, as there are several programs across the various
faculties for which this course is a requirement (e.g., engineering, commerce). Thus the students
taking this course are diverse in their backgrounds, particularly in their academic preparation.
This course, as is typical with economics courses, requires strong math and analytical skills.
Prior to the year in which tutorials were introduced, students were evaluated based on their
performance on two term tests and a final exam. The instructor offered optional online
homework. A student who completed the online homework assignments could use her
performance on the homework to reduce the weight allocated to her final exam. For a student
who chose not to complete the online homework assignments, the final grade was allocated as
follows:
- 25% of the marks received on the better of two term tests (typically test #1)
- 20% of the marks received on the other term test
- 55% of the marks received on the final exam
If a student completed the online homework assignment then the weight of the final exam
was reduced to 40%. The 15% for the online homework was allocated as 5% of the marks
received on a basic math test offered in the first two weeks of the course and 10% of the marks
received on weekly homework assignments. The math component of the online homework is
designed to review concepts learned in high school.
To ensure tutorial attendance, the instructor offered an incentive. The tutorials were not for
grades, but students were offered a “grade weight shift” as an incentive to attend. The grade
weight shift allowed students to shift a small portion of the weight from the final exam to their
higher term test grade. Students who performed better on the final exam than both tests would
not be penalized because their shift would work in the opposite direction. Historically, the
course average was higher on Test #1 than on Test #2, and both term test averages were typically
higher than the average on the final exam. Usually only five percent of the class performed
better on the final exam than both term tests. The percentage shift was as follows:
- Attend all five tutorials: shift 5% of final exam weight to the highest term test grade (the
  better term test becomes worth 30%).
- Attend four tutorials: shift 4% of final exam weight to the highest term test grade (the
  better term test becomes worth 29%).
- Attend three tutorials: shift 3% of final exam weight to the highest term test grade (the
  better term test becomes worth 28%).
- Attend zero to two tutorials: no weight shift; the grade is calculated according to the
  scheme described above.
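
The incentive scheme maps directly into a grade calculation. The following small Python sketch implements our reading of the shift for the no-homework weighting (25/20/55); the function is ours, for exposition only, and it does not model the opposite-direction shift applied to students whose exam beat both tests.

    def course_grade(test1, test2, exam, tutorials_attended):
        """Illustrative weight shift: attending 3/4/5 tutorials moves 3/4/5
        percentage points of weight from the final exam to the better term
        test (base weights 25/20/55, the no-online-homework scheme)."""
        shift = {5: 0.05, 4: 0.04, 3: 0.03}.get(tutorials_attended, 0.0)
        better, worse = max(test1, test2), min(test1, test2)
        return (0.25 + shift) * better + 0.20 * worse + (0.55 - shift) * exam

    # A student scoring 80 and 70 on the tests and 60 on the exam gains a
    # full percentage point by attending all five tutorials:
    print(course_grade(80, 70, 60, 0))  # 67.0
    print(course_grade(80, 70, 60, 5))  # 68.0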
Tutorials were provided for the students in Introductory Macroeconomics during the 2012/2013
academic year for the first time in two decades. Tutorials were held bi-weekly. There were five
tutorials held each semester, beginning in Week 4 in the fall and Week 2 in the winter. The
schedule for the tutorials and tests and the subjects covered (chapters of a text book) were as
follows:
Fall, 2012 (weeks 2-14, Sept. 17 - Dec. 3): bi-weekly tutorials beginning in week 4, covering
Chapters 1-3, 4-6, 7-9, 10-11, and 12-13, with Test 1 (Chapters 1-6) and Test 2 (Chapters 7-11)
interleaved, followed by the Final Exam.

Winter, 2013 (weeks 2-14, Jan. 17 - Apr. 10; Reading Week Feb. 18-22): bi-weekly tutorials
beginning in week 2, covering Chapters 1-3, 4-6, 7-8, 9-11, and 12-13, with Test 1 (Chapters
1-6) and Test 2 (Chapters 7-11) interleaved, followed by the Final Exam.
In the fall, most tutorials had an enrollment of approximately 70 students, with two exceptions.
The tutorials that met from 8:00 to 8:50 pm on Tuesdays and Wednesdays had 15 and 19
students, respectively.
Tutorials were fundamentally different in the fall and winter semesters. In the fall, Teaching
Assistants (TAs) stood in front of the tutorial section and delivered a “traditional” tutorial by
solving a set of problems on a blackboard. Students were expected to bring questions and
problems to the tutorial, having printed them from the course learning management system the
week before and attempted them on their own. The TA showed the solutions to the students,
then provided a corresponding multiple choice question and worked through its answer as
well.
In the winter semester, collaborative learning among students was the goal. Students were
given the problem sheets at the beginning of the tutorial and told to work on the questions in
small groups. Again, each chapter had two or three short answer problems with accompanying
multiple choice questions. Students worked in groups of 3-4 for approximately thirty
minutes and then shared their answers with other groups working nearby. TAs traveled
throughout the room and assisted groups as needed. They were instructed to provide guidance,
but not to directly tell the students the answers to the problems.
III. Data Description, Sample Selection and Summary Statistics

To study the effects of offering tutorials in large classes, four data sources were utilized.
The primary data are the records of the students from the course. The measures from this source
capture information on (a) student tutorial participation, (b) performance on two course exams
and one final exam, (c) participation in and performance on the online homework assignments,
(d) adjusted final course grade (excludes performance on the online homework assignments), and
(e) an anonymous identification of the tutor assigned to each section. These measures were
collected on students who participated in the course in both the 2011/2012 academic year
(pre-tutorials, control) and the 2012/2013 academic year (tutorials, treatment). The second data
source was obtained from registrar data held at the university. The core measures collected from
the registrar were program of registration and enrollment and performance in all courses taken
by the student. The third data source was information on the applications submitted by the
students for admission to the university. The data on applications included applications for
students applying directly from an Ontario high school (known as “101” students) and delayed
entry and/or non-Ontario high school students (known as “105” students). The 101 set of
applications captures information on the students’ performance in level 4 (grade 12) courses in
high school and their home postal code. For the 105 applicants, high school grade information
was more limited; the location of their residence and home postal code (if from Canada),
however, were available. Using the home postal code, measures from the fourth data source, the
2006 Canadian Census, were added to capture the socio-economic characteristics of the
neighbourhood where the student’s family resided. The census geography utilized was the
dissemination area, a geography that covers roughly 500 households.
The core sample studied in this report comprises students who were enrolled in the course,
for whom we observe information from their university application, and who received a final
grade. We therefore include in our sample a total of 4,384 enrollments. Some of these
students, however, are observed repeating the course, leaving us with a total of 4,342 unique
students. The enrollment in the fall term is lower than in the winter term, as one fewer
section of the course was offered.
Table 1: Sample comparison with the student population

                                          Sample      All students at    All Ontario
                                          students    university under   students
                                                      study (1)          (2)
Total students in sample                  4,342       9,541              130,282
Delayed or non-Ontario (% 105 students)   11.8%       --                 --
Gender (% male)                           64.7%       47.0%              44.1%
Immigrant status
  Canadian Citizen                        78.3%       88.5%              90.9%
  Permanent Resident                      9.1%        7.0%               5.7%
  Other                                   12.6%       4.6%               3.4%
Years in Canadian K-12 school system
  6 years or more                         77.5%       88.6%              90.8%
  3-5 years                               12.7%       6.7%               4.8%
  2 years or less                         9.8%        4.7%               4.4%
% with English reported as their
  primary language                        65.9%       75.5%              76.9%
Indication on application that an
  application for financial aid was
  submitted (% applied)                   60.9%       67.9%              65.8%
Students with ON postal code (3)          4,024       9,510              129,925
% living in low income neighbourhood      13.2%       14.5%              20.3%
High School GPA (Best 6 University or Mixed Courses)
  Average Best 6 GPA (s.d.)               86.5 (5.0)  86.6 (5.5)         83.7 (6.5)
  5th percentile of best 6                78.3%       77.2%              72.7%
  Median best 6                           86.2%       86.7%              83.8%
  95th percentile of best 6               95.5%       95.7%              94.2%
Students by Program of Registration, Year 1 (4)
  Commerce (includes Social Sciences)     28.8%       11.8%              14.1%
  Engineering                             28.2%       20.3%              9.3%
  Science/Health                          22.6%       32.7%              28.9%
  Humanities                              15.5%       30.2%              39.0%
  Other                                   5.0%        5.1%               8.7%

(Note: Differences of group means between columns 1 and 2 are not statistically different from
zero. The differences in means between the university under study and all Ontario students,
however, are statistically different at the 1% confidence level.)

NOTES:
(1) All students who applied directly from HS in 2011 and 2012 and were registered at
McMaster. A few students are observed taking the course in more than one term.
(2) All students who applied directly from HS in 2011 and 2012 and were registered at any
university.
(3) OUAC 101 students (direct applicants) may have a non-Ontario postal code if they attend an
Ontario-approved international high school. For schools located in Ontario, if the student
postal code was invalid, the school postal code was used.
(4) Program of registration is based on transcript data for column 1 and application data for
columns 2 and 3.
In Table 1 the characteristics of the students under study are compared with the
characteristics of the 101 students who first enrolled in the university under study in 2011 or
2012 (column 2) and all Ontario direct-entry students observed registering at an Ontario
university in 2011 or 2012 (column 3). As a portion of the students in the course are registered
at a higher level (mostly level 2), we compare the entering characteristics of these students
based on the information provided in their applications. As shown in Table 1, there are more
males in the course than at the university and in the entire system. Fewer students in the course
are Canadian citizens, spent 6 or more years in the Ontario K-12 system, or report English as
their primary language than at the university or in the province. A lower proportion of the
students indicated that they applied for financial aid on their university application form and,
of those with an Ontario postal code, a lower proportion have a family address located in a
low-income neighbourhood (bottom tercile of neighbourhoods). There are small but not
statistically significant differences in entering high school averages, with the students under
study reporting slightly lower entering averages relative to all students at the university under
study but higher averages than those reported for all Ontario registrants. Finally, the students
in the study more heavily represent commerce and engineering than science and humanities.
Overall, then, the students under study are more likely to be foreign born, male, from higher
income families, and more interested in commerce and engineering.
Table 2: Demographic Characteristics of Sample Students: Comparison Across Terms

                                              Fall Term                   Winter Term
                                              (Traditional Tutorial)      (Collaborative Learning)
                                              Control      Treatment      Control      Treatment
                                              (2011)       (2012)         (2012)       (2013)
                                              (1)          (3)            (2)          (4)
Total students per term                       704          861            1,454        1,365
Gender (% male)                               62.2%        63.8%          66.2%        65.1%
Immigrant Status
  Canadian Citizen                            77.4%        78.8%          76.8%        79.3%
  Permanent Resident                          7.7%         9.2%           10.8%        8.1%
  Other                                       14.9%        12.1%          12.4%        12.6%
Students by number of years in Canadian school system (at admission)
  6 years or more                             76.3%        77.6%          76.1%        78.8%
  3-5 years                                   15.2%        12.7%          12.9%        11.7%
  <= 2 years                                  8.5%         9.8%           11.0%        9.6%
% with English reported as their
primary language                              64.6%        65.4%          65.3%        67.1%
Students by OSAP application status
  Applied for OSAP (% applied)                56.0%        63.9%          61.6%        60.2%
Students by neighbourhood of residence (if residing in Ontario at admission)
  Students with ON postal code                643          800            1,356        1,259
  % living in low income
  neighbourhood (bottom tercile)              12.3%        14.8%          14.6%        14.9%
  % living in mid-income
  neighbourhood (middle tercile)              32.8%        33.4%          31.1%        31.2%
  % living in high income
  neighbourhood (top tercile)                 54.9%        51.9%          54.4%        53.9%
  Median distance to the University, km       47.6         47.0           46.9         48.7
High School GPA (Best 6 University or Mixed Courses)
  Average Best 6                              85.6         86.3           86.4         87.2
  (Standard Deviation of Best 6)              (4.6)        (4.7)          (4.9)        (5.3)
  Grade difference with Fall 2011 students    --           0.7 (0.3)***   --           --
  Grade difference with Winter 2012 students  --           --             --           0.8 (0.2)***
  Median Best 6                               85.3         85.9           85.8         86.7
High School Math (average of all grade 12 University stream courses)
  # students with at least one Math course    598          729            1,263        1,166
  (% 101 students)                            97.1%        96.2%          97.8%        97.4%
  (% of all students)                         84.9%        84.7%          86.9%        85.4%
  # level 12 Math courses taken
    0                                         18           29             29           31
    1                                         44           56             67           64
    2                                         372          432            794          749
    3 or 4                                    182          241            402          353
    (no HS course information)                88           103            162          168
  Average of Best Math grade                  86.1         86.9           87.6         88.0
  (standard deviation)                        (8.7)        (8.6)          (7.9)        (8.4)
  Median Best Math grade                      87           89             89           90
  Performance in Course Online Math Test      81.9         79.4           82.0         80.0
  (standard deviation)                        (30.4)       (33.0)         (31.3)       (33.2)
Students by Enrolled Faculty At Time of Course
  Business                                    38.64%       26.71%         28.27%       24.25%
  Social Sciences                             14.20%       14.98%         11.83%       15.31%
  Engineering                                 23.15%       24.85%         31.64%       28.57%
  Science/Health                              19.74%       28.69%         24.97%       27.69%
  Humanities                                  3.84%        3.72%          2.89%        3.66%
  Not Declared or Not Applicable              0.43%        1.05%          0.41%        0.51%
Share students registered in Level 1          73.15%       68.64%         79.78%       76.12%
Concurrent or Past Enrollment in First Year Microeconomics Course
  Number of students                          228          327            1,305        1,159
  Mean performance in microeconomics
  course (scale of 12)                        7.9          8.0            7.5          8.4
  (standard deviation)                        (3.1)        (3.1)          (3.0)        (3.1)
Overall Term Performance: total # credits taken in the Term
  Average # of credits                        13.2         13.6           13.9         13.8
  (standard deviation)                        (3.4)        (3.3)          (3.5)        (3.6)
  Minimum credits                             3            3              3            3
  Median credits                              12           15             15           15
  Max credits                                 21           21             21           21
Share of students with "full time loads"      79.12%       82.58%         83.22%       81.10%
Do our students vary across terms? Table 2 shows the characteristics of our study
students grouped by term of enrollment in the course. Across the terms there are only marginal
differences in most of the measures. The core differences are in the proportion of students who are
male (higher in the winter terms) and the proportion residing in low-income neighbourhoods
(smallest in fall 2011). In terms of preparation, the students in the winter terms have
slightly higher high school GPAs and higher averages for the best level 4 (grade 12) math course
taken in high school. Presumably these differences are driven by the fact that a greater share of
winter-term students are enrolled in the engineering faculty; engineering students typically enter
the university with better grades, given high admission standards, and are also more math oriented,
an important factor in the course under study.
IV. Analysis

a. Tutorial Attendance
The first set of analyses focuses on participation in tutorials. The introduction of
tutorials follows a quasi-experimental design, given that students choose to take the course and
that tutorials had to be offered to all students in a given term. Offering tutorials and
participating in tutorials, however, are not the same thing. Because a core justification for
introducing tutorials is to engage students, especially those likely to face difficulties in the
first year of university studies, the first analysis focuses on participation in the tutorials:
who participates, and to what extent a student continues to participate in the tutorials
throughout the term.
Figure 1: Tutorial Attendance
Figure 1 depicts tutorial attendance. In both terms, a high share of students attended at
least 1 tutorial, with most students (~70%) attending at least 3 tutorials. In each term, we
observe a decline in attendance: approximately a 5 percentage point drop from attending 1 to
2 tutorials and from attending 2 to 3 tutorials, and approximately a 10 percentage point drop
from attending 3 to 4 tutorials. The biggest drop-off in tutorial participation is from 4 to 5
tutorials, a 17-19 percentage point decline. A key component of tutorial attendance is
participation in the first and/or second tutorial. If a student failed to attend the first or
second tutorial, the student was rarely observed attending the remaining tutorials. In the
fall term, 168 students did not attend at least one of the first two tutorials; less than 10
percent of these students (14) were observed attending a later tutorial. Similarly, in the
winter term, 299 students failed to attend at least one of the first two tutorials; less than
4 percent of these students (9) were observed attending a subsequent tutorial.
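These tabulations are straightforward to construct from an attendance matrix. The sketch below assumes a hypothetical tutorial_attendance.csv with one 0/1 column per tutorial (tut1 through tut5); it shows one way to compute the share of students attending at least k tutorials and whether early non-attenders ever return.

    # Illustrative tabulation of tutorial attendance; data layout is assumed.
    import pandas as pd

    attend = pd.read_csv("tutorial_attendance.csv")  # one 0/1 column per tutorial
    tuts = [f"tut{i}" for i in range(1, 6)]

    # Share of students attending at least k tutorials, k = 1..5.
    total = attend[tuts].sum(axis=1)
    for k in range(1, 6):
        print(k, round((total >= k).mean(), 3))

    # Students who missed tutorial 1 or 2, and whether they attended any later one.
    missed_early = (attend["tut1"] == 0) | (attend["tut2"] == 0)
    returned = attend.loc[missed_early, ["tut3", "tut4", "tut5"]].sum(axis=1) > 0
    print(missed_early.sum(), "missed an early tutorial;", returned.sum(), "returned")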
To what extent is tutorial attendance correlated with being more academically prepared?
While there is no perfect measure of academic preparation, we can look at the relationship
between performance in high school, as measured by the best level 4 (grade 12) math grade, and
tutorial attendance. As mentioned in the introduction, math skills are required for successful
completion of the course under study. In Figure 2 we group students by the number of tutorials
attended and depict whether their best level 4 math grade is above or below the median math grade
for the students in the course. Across the board, approximately half of the students fall above
and half fall below the median. There is, therefore, no prima facie evidence that only the better
academically prepared students (as measured by math grades) are attending the tutorials.
Figure 2: Tutorial Attendance and Grade 12 Math Performance
Are there other observed characteristics that are correlated with tutorial attendance? In
Table 3, we report the results from linear probability regressions in which the dependent
variable is either a zero/one indicator for whether the student attends at least one tutorial or
the number of tutorials attended. Columns 1 to 4 report the results for the fall term and columns
5 to 8 report the results for the winter term. In the odd columns, the measure of preparedness for
the course is the score on the best high school level 4 math course and its squared term. In the
even columns, the measure of preparedness is performance on the online math test offered in the
first two weeks of the course and its squared term. For students with either no high school math
mark or no in-course online math mark, we assign a value equal to the mean of the respective mark
and include an indicator variable equal to 1 if the student does not have the respective mark. The
omitted category is not having taken the test.
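A minimal sketch of a column-1-style specification in Python follows; the column names (attended_any, hs_math, no_hs_math, level1, full_time, female, neighbourhood) are hypothetical stand-ins rather than the actual variable names in our data.

    # Sketch of the linear probability model with mean-imputed marks and
    # missing-mark indicators, standard errors clustered by neighbourhood.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("students.csv")  # hypothetical file and columns

    # Mean-impute missing math marks and flag them, as described above.
    for mark, flag in [("hs_math", "no_hs_math"), ("online_math", "no_online_math")]:
        df[flag] = df[mark].isna().astype(int)
        df[mark] = df[mark].fillna(df[mark].mean())

    m = smf.ols(
        "attended_any ~ hs_math + I(hs_math**2/100) + no_hs_math"
        " + level1 + full_time + female",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["neighbourhood"]})
    print(m.summary())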
Table 3: Likelihood of Attending a Tutorial
(Dependent variable: 0/1 indicator for attending at least one tutorial in columns 1, 2, 5, and 6;
number of tutorials attended in columns 3, 4, 7, and 8.)

Panel A: Fall 2012

                                                      Attended At Least 1 Tutorial       # of Tutorials Attended
                                                      (1)               (2)              (3)               (4)
Best High School Math Mark                            -0.03** (0.01)                     -0.14** (0.06)
Best High School Math Mark Squared (/100)             0.02** (0.01)                      0.09** (0.04)
Below Median on Best HS Math Mark                     -0.01 (0.05)                       -0.07 (0.25)
No HS Math                                            1.10* (0.58)                       6.35** (2.89)
No Transcript                                         1.18** (0.58)                      6.42** (2.93)
In Course Online Math Mark                                              0.48 (0.77)                        3.62 (3.27)
In Course Online Math Mark Squared (/100)                               -6.16 (59.68)                      -121.31 (270.13)
Below Median on Online Math Mark                                        -0.03 (0.04)                       -0.25 (0.22)
No Online Math Mark                                                     -0.21*** (0.05)                    0.93*** (0.28)
Level 1 Student                                       0.16*** (0.03)    0.15*** (0.03)   0.72*** (0.17)    0.67*** (0.16)
Full Time Student                                     0.02 (0.03)       -0.01 (0.03)     0.46*** (0.16)    0.35** (0.16)
Student = Female                                      0.01 (0.03)       0.01 (0.02)      0.41*** (0.14)    0.41*** (0.13)
Enrolled in Business Faculty                          0.03 (0.04)       0.01 (0.03)      0.44** (0.21)     0.33* (0.19)
Enrolled in Engineering Faculty                       0.18*** (0.06)    -0.17*** (0.05)  0.68*** (0.24)    0.66*** (0.22)
Enrolled in Science or Health Science                 0.03 (0.05)       0.01 (0.04)      0.18 (0.23)       0.05 (0.21)
Enrolled in Humanities                                0.00 (0.07)       0.02 (0.06)      0.08 (0.35)       0.14 (0.36)
Canadian Citizen                                      -0.01 (0.04)      -0.03 (0.04)     -0.27 (0.19)      -0.41** (0.20)
Language Spoken at Home is not English                -0.02 (0.03)      -0.02 (0.03)     -0.18 (0.15)      -0.12 (0.14)
Indicated Applied for Financial Aid                   0.06** (0.03)     0.05* (0.03)     0.42*** (0.13)    0.37*** (0.13)
Age <= 17                                             0.00 (0.14)       0.03 (0.13)      0.22 (0.45)       0.32 (0.47)
Age = 19                                              0.01 (0.03)       0.02 (0.03)      0.05 (0.15)       0.07 (0.14)
Age = 20+                                             0.04 (0.05)       0.04 (0.04)      0.09 (0.25)       -0.00 (0.24)
Neighbourhood Share of Unemployment for
Individuals 25+                                       1.73 (1.46)       2.00 (1.35)      11.19 (7.59)      13.09* (6.94)
Neighbourhood Income in Bottom Tercile                -0.03 (0.06)      -0.06 (0.06)     -0.37 (0.29)      -0.52* (0.28)
Neighbourhood Income in Middle Tercile                -0.02 (0.03)      -0.03 (0.03)     -0.27 (0.17)      -0.34** (0.16)
Total Population in Neighbourhood (/1000)             0.15 (0.67)       -0.28 (0.65)     1.44 (3.69)       -0.75 (3.56)
Share of the Population Aged 15-24                    -0.61 (0.78)      -0.18 (0.72)     -4.40 (4.49)      -2.35 (4.23)
Share of Families = 1 Parent                          -1.11 (1.77)      -0.85 (1.72)     5.56 (9.25)       6.93 (9.03)
Share of Population with High School Degree           0.91 (0.93)       0.91 (0.83)      5.77 (4.76)       5.70 (4.28)
Share of Population with University or Higher Degree  0.36 (0.26)       0.31 (0.22)      1.32 (1.31)       1.03 (1.15)
Share of Population = Visible Minority                -0.07 (0.08)      -0.07 (0.07)     -0.07 (0.42)      -0.07 (0.38)
Family Residence in Neighbourhood not in Canada       0.08 (0.06)       0.04 (0.03)      0.40 (0.32)       -0.08 (0.16)
Constant                                              2.43 (1.87)       0.88 (1.76)      0.55 (9.98)       -8.35 (9.53)
Observations                                          852               852              852               852
R-squared                                             0.23              0.27             0.23              0.27

Panel B: Winter 2013

                                                      Attended At Least 1 Tutorial       # of Tutorials Attended
                                                      (5)               (6)              (7)               (8)
Best High School Math Mark                            -0.01 (0.02)                       -0.17* (0.09)
Best High School Math Mark Squared (/100)             0.01 (0.01)                        0.11** (0.05)
Below Median on Best HS Math Mark                     -0.00 (0.03)                       0.02 (0.19)
No HS Math                                            0.38 (0.80)                        6.57* (3.95)
No Transcript                                         0.66 (0.79)                        7.81** (3.90)
In Course Online Math Mark                                              -0.01 (0.57)                       -1.04 (2.54)
In Course Online Math Mark Squared (/100)                               8.48 (45.51)                       154.68 (210.70)
Below Median on Online Math Mark                                        -0.03 (0.03)                       -0.13 (0.17)
No Online Math Mark                                                     0.23*** (0.04)                     -1.37*** (0.17)
Level 1 Student                                       0.18*** (0.03)    0.16*** (0.03)   0.93*** (0.13)    0.82*** (0.13)
Full Time Student                                     0.10*** (0.03)    0.08*** (0.02)   0.38*** (0.14)    0.25* (0.14)
Student = Female                                      0.06*** (0.02)    0.06*** (0.02)   0.37*** (0.09)    0.36*** (0.09)
Enrolled in Business Faculty                          -0.01 (0.03)      0.01 (0.03)      0.12 (0.18)       0.13 (0.15)
Enrolled in Engineering Faculty                       -0.04 (0.04)      -0.02 (0.03)     -0.04 (0.17)      0.04 (0.16)
Enrolled in Science or Health Science                 0.06 (0.04)       0.06* (0.03)     0.39** (0.19)     0.37** (0.15)
Enrolled in Humanities                                -0.10 (0.08)      -0.14* (0.08)    -0.20 (0.34)      -0.32 (0.32)
Canadian Citizen                                      -0.05 (0.03)      -0.06* (0.03)    -0.12 (0.17)      -0.18 (0.16)
Language Spoken at Home is not English                0.04 (0.03)       0.02 (0.03)      0.16 (0.16)       0.09 (0.14)
Indicated Applied for Financial Aid                   0.01 (0.02)       0.02 (0.02)      0.02 (0.11)       0.07 (0.10)
Age <= 17                                             -0.00 (0.06)      -0.01 (0.06)     0.34 (0.31)       0.33 (0.31)
Age = 19                                              0.01 (0.03)       0.02 (0.03)      0.12 (0.14)       0.21 (0.13)
Age = 20+                                             -0.05 (0.05)      -0.02 (0.04)     -0.34 (0.22)      -0.23 (0.18)
Neighbourhood Share of Unemployment for
Individuals 25+                                       -0.64 (1.27)      -0.40 (1.24)     -2.96 (6.18)      -1.69 (5.89)
Neighbourhood Income in Bottom Tercile                -0.05 (0.04)      -0.06 (0.04)     -0.22 (0.21)      -0.29 (0.21)
Neighbourhood Income in Middle Tercile                -0.02 (0.03)      -0.02 (0.03)     0.05 (0.15)       0.05 (0.14)
Total Population in Neighbourhood (/1000)             -0.07 (0.45)      -0.06 (0.46)     3.47 (2.51)       3.36 (2.54)
Share of the Population Aged 15-24                    -0.32 (0.55)      -0.49 (0.52)     -0.63 (2.62)      -1.49 (2.50)
Share of Families = 1 Parent                          0.77 (1.37)       0.97 (1.41)      9.82 (6.13)       11.77* (6.18)
Share of Population with High School Degree           0.02 (0.73)       0.25 (0.70)      1.35 (3.51)       2.42 (3.31)
Share of Population with University or Higher Degree  -0.05 (0.23)      -0.02 (0.21)     -0.52 (1.07)      -0.43 (0.99)
Share of Population = Visible Minority                0.06 (0.07)       0.06 (0.06)      0.39 (0.34)       0.39 (0.32)
Family Residence in Neighbourhood not in Canada       0.01 (0.05)       0.01 (0.03)      0.18 (0.24)       0.07 (0.13)
Constant                                              0.35 (1.64)       -0.37 (1.44)     -1.91 (7.28)      -10.23 (6.34)
Observations                                          1,358             1,358            1,358             1,358
R-squared                                             0.34              0.38             0.29              0.34

Robust standard errors clustered by neighbourhood in parentheses.
*** p<0.01, ** p<0.05, * p<0.1
Focusing first on tutorial attendance in the fall term, the results suggest a negative
correlation between the best high school math mark and tutorial attendance but a positive
correlation with its squared term. Across most marks, however, this implies only a slight decline
in attendance as math marks rise, and for most of the measures the coefficient is not
significantly different from zero. There is a higher likelihood of tutorial attendance by level 1
students, female students, full time students, and students who indicated they were applying for
financial aid. Compared to students in social sciences, students in business are more likely to
attend tutorials, while those in engineering are less likely to attend. Moving next to the results
for the winter term, we observe similar results but with different magnitudes. For example, a
level 1 student in the fall is predicted to attend an additional 0.7 tutorials, whereas a level 1
student in the winter is predicted to attend an additional 0.9 tutorials. In addition, the
coefficient on the measure for applying for financial aid is not significantly different from zero
in the winter term. A student's faculty also plays a different role in the winter term: business
and engineering students are no more or less likely than social sciences students to attend, while
science students are more likely and humanities students less likely to attend.
Overall, while there are some differences in the likelihood of attending a tutorial based on
observed characteristics, the evidence does not overwhelmingly support the notion that less
prepared students stay away from the tutorials. This, in part, is likely attributable to the
relatively high participation rates in the tutorials.
b. Test Performance
Does offering tutorials matter? We turn next to observed performance on the two term
tests. In Table 4 we report the average performance on the two term tests across the four terms.
As demonstrated above, there are differences in the composition of students across terms; we
therefore compare students based on tutorial type (traditional v. collaborative) and enrollment
term. Beginning with the fall term (columns 1 and 2), average performance on term test 1 is
similar across the two years. Average student performance on term test 2 is higher in the term
with tutorials, but average student performance on the final exam is lower. The stronger
performance on test 2 runs counter to what has been observed historically in this course
(performance is usually lower on test 2), suggesting that the tutorials may have had an effect on
understanding the concepts and, thus, on test 2 performance. The decline on the final exam is
somewhat puzzling. Recall, however, that the weight attributed to a student's final exam is
shifted to performance on the best term test if the student attended tutorials. The decline in
final exam performance might therefore reflect a strategic decision by students who seek a minimum
grade rather than maximum knowledge. Without more information, however, we cannot explore this
hypothesis.
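To make the incentive concrete, the weight-shifting rule can be written as a small function. The weights below (w_tests, w_final, shift) are purely hypothetical placeholders, not the actual course weights; the sketch only illustrates how moving weight from the final exam onto the best term test weakens the incentive to perform on the final once the term tests are done.

    # Illustration of the weight-shifting rule described above; all weights
    # are hypothetical. Remaining weight would go to homework/tutorial marks.
    def course_grade(test1, test2, final, attended_tutorials,
                     w_tests=0.20, w_final=0.50, shift=0.10):
        best = max(test1, test2)
        if attended_tutorials:
            # Part of the final-exam weight moves onto the best term test.
            return w_tests * (test1 + test2) + (w_final - shift) * final + shift * best
        return w_tests * (test1 + test2) + w_final * final

Under this rule, a tutorial attender with a strong best test loses less from a weak final exam, which is consistent with the strategic story above.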
Turning next to the winter terms, average performance on the term tests and the final exam
is lower in the term with the collaborative tutorials. This is perplexing. Recall, however, that
there are lower shares of students enrolled in engineering and business programs in winter 2013
than in winter 2012, and a lower share of students who took 2 or more grade 12 math courses in
2013 than in 2012.
Table 4: Average Student Performance on Term Tests and Final Exam

                              Fall Term                               Winter Term
                              (Treatment = Traditional Tutorial)      (Treatment = Collaborative Learning)
                              Control    Treatment   Treatment        Control    Treatment   Treatment
                              (2011)     (2012)      - Control        (2012)     (2013)      - Control
Total students in sample      704        861                          1,454      1,365
# of students repeating
the course                                                            10         8
Average Test 1 (0-100)        71.8       71.7        -0.1             76.4       72.8        -3.6
(standard deviation)          (14.1)     (15.2)                       (13.6)     (18.1)
Average Test 2 (0-100)        63.9       68.0        4.1              72.4       65.4        -7.0
(standard deviation)          (16.5)     (20.8)                       (16.5)     (19.5)
Average Final Exam (0-100)    68.7       62.8        -5.9             65.0       61.4        -3.6
(standard deviation)          (15.8)     (14.2)                       (15.0)     (14.8)
In Table 5, we report regression results using performance on the term tests as the
dependent variable. In columns 1 and 2, performance on term test 1 is the dependent variable.
Tutorials 1 and 2 covered material on this test, so we explore the effects of attendance at
either or both of these tutorials on test performance, after controlling for background
characteristics of the students and of the neighbourhoods in which their parents reside. The
results in column 1 reflect overall tutorial participation during the 2012/13 school year. The
specification in column 2 allows for a differential effect of the collaborative learning tutorial
and also explores whether there is a differential effect for students whose high school marks fall
below the median of the students enrolled in the course.
Overall, there is a positive effect of tutorial participation on test performance.
Participation in one or both tutorials increases performance on the test by an average of 2 to 2.5
percentage points. There is, however, no discernible additional effect of the collaborative
learning form of tutorial on performance. The coefficient on the interaction between tutorial
participation and having a below-median math mark is negative but imprecisely measured.
There is also a strongly positive effect of tutorial participation on term test 2 (columns 3
and 4). For this test, the material covered in tutorials 3 and 4 was most relevant. Given that we
observe a difference in participation across these two tutorials, we include separate measures for
participating in each. Participating in tutorial 3 increased performance on the test by 2.2
percentage points overall, and by 3.7 percentage points for the traditional tutorials.
Participating in both tutorials increased performance by an average of between 7 and 9.2
percentage points. Although not always precisely measured, there is some evidence that the
collaborative learning tutorials did not increase performance to the same degree that the
traditional tutorials did.
Table 5: Effects of Tutorials on Term Tests

                                                Test 1                             Test 2
Dependent Variable                              (1)              (2)               (3)              (4)
Attended Tutorial #1 and/or Tutorial #2         2.25*** (0.53)   2.43*** (0.54)
  * Collaborative Learning                                       0.30 (0.36)
  * Below Median High School Average
    (Best 6 Grade 12)                                            -0.69 (0.45)
Attended Tutorial #3                                                               2.13* (1.13)     3.67** (1.77)
  * Collaborative Learning                                                                          -2.24 (1.99)
  * Below Median High School Average
    (Best 6 Grade 12)                                                                               -0.39 (1.85)
Attended Tutorial #4                                                               4.93*** (1.06)   5.60*** (1.62)
  * Collaborative Learning                                                                          -3.54* (1.94)
  * Below Median High School Average
    (Best 6 Grade 12)                                                                               2.75 (1.86)
Best High School Average (Best 6 Grade 12)      1.14*** (0.09)   1.14*** (0.09)    1.17*** (0.09)   1.22*** (0.08)
Below Median High School Average
(Best 6 Grade 12)                               -0.07 (0.74)     0.41 (0.87)       0.58 (0.67)      0.32 (0.81)
No High School Average                          2.88** (1.13)    3.01*** (1.13)    3.66** (1.42)    3.52** (1.44)
Level 1 Student                                 -4.23*** (0.54)  -4.26*** (0.55)   -3.74*** (0.89)  -3.54*** (0.88)
Full Time Student                               0.69 (0.52)      0.75 (0.52)       1.48** (0.73)    1.39* (0.75)
Student = Female                                -1.59*** (0.50)  -1.60*** (0.50)   -0.36 (0.59)     -0.36 (0.58)
Enrolled in Business Faculty                    4.17*** (0.80)   4.14*** (0.80)    2.04* (1.16)     1.83 (1.19)
Enrolled in Engineering Faculty                 4.18*** (0.85)   4.12*** (0.85)    1.71 (1.13)      1.73 (1.14)
Enrolled in Science or Health Science           7.06*** (1.05)   7.00*** (1.05)    5.83*** (1.69)   5.73*** (1.69)
Enrolled in Humanities                          0.53 (1.51)      0.52 (1.51)       -2.11 (1.84)     -2.13 (1.84)
Additional Controls                             Neighbourhood socio-economic measures (all columns)
Observations                                    4,359            4,359             4,359            4,359
R-squared                                       0.27             0.27              0.20             0.20

Robust standard errors clustered by neighbourhood reported in parentheses.
*** p<0.01, ** p<0.05, * p<0.1
c. Final Exam and Overall Course Performance
Do the tutorials affect the final exam and overall performance in the course? In Table 6 we
report the results using the final exam mark and the overall course grade as dependent variables.
For all specifications, we include an indicator variable for participating in at least 1 tutorial,
the total number of tutorials attended, and interaction terms between the number of tutorials
attended and (i) the collaborative tutorial and (ii) being below the median on high school marks.
In column 1 the dependent variable is the final exam mark. In columns 2 and 3 the dependent
variable is the overall grade in the course. In column 2 the overall grade includes the marks
attributed for undertaking the online homework and participating in tutorials. In column 3, the
overall grade excludes the marks assigned for these two items, thus accounting only for
performance on the two term tests and the final exam.
Overall, attending a single tutorial has no effect, or a relatively small negative effect, on
final exam and overall course performance. There is an increasingly positive effect of attending
several tutorials on these measures of performance. The effect of the collaborative learning
tutorials, however, is smaller than that of the traditional tutorials. For example, a student who
attended all five traditional tutorials improved her final exam mark by 5.7 percentage points and
her overall grade (adjusted) by 6.5 percentage points. If the same student attended all five
collaborative tutorials, her final exam mark improved by 2 percentage points and her overall grade
increased by 3.4 percentage points. There is no discernible effect of the tutorials on the
performance of students who entered the course at the lower end of the distribution of high
school marks.
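These magnitudes follow directly from the Table 6 coefficients: the effect of attending n tutorials is the "ever attend" coefficient plus n times the per-tutorial coefficient (plus the collaborative interaction where relevant). A short sketch of the arithmetic:

    # Back-of-the-envelope effects from the Table 6 coefficients
    # (column 1: final exam; column 3: adjusted overall grade).
    def tutorial_effect(ever, per_tutorial, collab_adj, n, collaborative):
        slope = per_tutorial + (collab_adj if collaborative else 0.0)
        return ever + slope * n

    # Final exam, five traditional tutorials: -1.64 + 5 * 1.47 = 5.71
    print(tutorial_effect(-1.64, 1.47, -0.75, 5, False))   # ~5.7
    # Final exam, five collaborative tutorials: -1.64 + 5 * (1.47 - 0.75) = 1.96
    print(tutorial_effect(-1.64, 1.47, -0.75, 5, True))    # ~2.0
    # Adjusted overall grade, five traditional tutorials: -1.59 + 5 * 1.62 = 6.51
    print(tutorial_effect(-1.59, 1.62, -0.65, 5, False))   # ~6.5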
Any grade difference of more than three percentage points is meaningful to the students
in the sample. The university under study uses a twelve point grading system to calculate
cumulative averages, where an “F” equals zero, “D-” equals one, “D” equals two, up to “A+”
equaling twelve. For grades in the “D”, “C”, and “B” ranges, an increase in grade of three
percentage points is enough to move a student up to the next grade level. For example, a student
with a grade of 67% has a C+, which translates to six out of twelve, while a student with 70%
has a B-, which is seven out of twelve.
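For illustration, this percentage-to-points conversion can be written as a simple lookup. Only the 67% (C+, six) and 70% (B-, seven) anchors come from the text above; the remaining cut-offs below are the usual Ontario-style letter bands and should be treated as assumptions.

    # Mapping from percentage grade to the twelve-point scale described above.
    # Cut-offs other than the 67 -> 6 and 70 -> 7 anchors are assumptions.
    CUTOFFS = [(90, 12), (85, 11), (80, 10), (77, 9), (73, 8), (70, 7),
               (67, 6), (63, 5), (60, 4), (57, 3), (53, 2), (50, 1)]

    def grade_points(pct):
        for cutoff, points in CUTOFFS:
            if pct >= cutoff:
                return points
        return 0  # F

    assert grade_points(67) == 6   # C+
    assert grade_points(70) == 7   # B-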
Table 6: Effect of Tutorials on Final Exam and Overall Course Grade

                                                Final Exam        Overall Grade:    Overall Grade:
                                                                  Unadjusted        Adjusted
Dependent Variable                              (1)               (2)               (3)
Ever Attend A Tutorial                          -1.64 (1.13)      -0.68 (0.99)      -1.59 (1.11)
Number of Tutorials Attended                    1.47*** (0.26)    1.61*** (0.22)    1.62*** (0.25)
  * Collaborative Learning                      -0.75*** (0.16)   -0.48*** (0.13)   -0.65*** (0.13)
  * Below Median High School Average
    (Best 6 of Grade 12)                        -0.27 (0.17)      -0.03 (0.14)      -0.12 (0.15)
Best High School Average (Best 6 Grade 12)      1.24*** (0.06)    1.07*** (0.05)    1.20*** (0.06)
Below Median High School Average
(Best 6 Grade 12)                               0.33 (0.57)       0.22 (0.50)       0.35 (0.52)
No High School Average                          3.91*** (1.33)    3.30*** (0.99)    3.64*** (1.08)
Level 1 Student                                 -4.94*** (0.62)   -3.93*** (0.49)   -4.50*** (0.54)
Full Time Student                               -0.00 (0.52)      0.60 (0.43)       0.44 (0.45)
Student = Female                                -1.42*** (0.38)   -1.09*** (0.33)   -1.27*** (0.36)
Enrolled in Business Faculty                    3.17*** (0.79)    3.70*** (0.62)    3.06*** (0.77)
Enrolled in Engineering Faculty                 2.45*** (0.84)    2.80*** (0.62)    2.68*** (0.75)
Enrolled in Science or Health Science           7.17*** (1.23)    7.15*** (0.94)    6.83*** (1.16)
Enrolled in Humanities                          -2.60 (1.60)      -1.51 (1.32)      -1.81 (1.41)
Additional Controls                             Neighbourhood socio-economic measures (all columns)
Observations                                    4,359             4,359             4,359
R-squared                                       0.34              0.39              0.38

Robust standard errors clustered by neighbourhood reported in parentheses.
*** p<0.01, ** p<0.05, * p<0.1
V. Discussion
Participation rates in the optional tutorials were quite high, with almost seventy percent
of students attending at least three of the five tutorials offered. Students who did not attend
one of the first two tutorials were unlikely to attend any of the remaining tutorial sessions.
First year students, females, and students applying for financial aid are more likely to attend
tutorials. To the extent that these are high risk groups for withdrawing from a course, or from
the university, these findings support the idea of offering tutorials. Countering that, we find
that students from high income neighbourhoods are also more likely to attend tutorials.
Term test performance is, on average, improved by tutorial participation. Attending a
single tutorial does not improve final exam or overall course performance, but attending
more than one tutorial has a cumulative effect on exam and overall performance. A student
who attends all five tutorials will likely improve their course grade by two full points on a
twelve-point grade scale.
Traditional tutorials appear to help students more than collaborative learning tutorials,
which was not expected. It may be that the tutorials were too large for the collaborative
learning method to be effective: each teaching assistant had almost seventy students per
tutorial section, a size that may be better suited to the more traditional tutorial style. It may
also be that having students in the traditional tutorials attempt the problem set ahead of the
tutorial positively influenced performance in the course. With tight budgets and large
section sizes, the traditional tutorial may be an effective means of directing limited resources
toward student academic success.
We are also disappointed that students at the lower end of the distribution were not
helped more by adding tutorials to the course. The research so far is mixed on this issue, with
some studies finding that tutorials have a greater impact on weaker student performance,
while others do not (Huynh, Jacho-Chavez, and Self, 2010b). Overall, we found that student
participation in tutorials helped performance in the course, and we believe that these gains in
learning are worth the cost of offering tutorials.
References
Chemers, M. M., Hu, L., and Garcia, B. F. (2001). "Academic self-efficacy and first year college
        student performance and adjustment." Journal of Educational Psychology, 93(1), 55-64.
Chickering, A., and Gamson, Z. (1987). "Seven principles for good practice in undergraduate
        education." American Association of Higher Education Bulletin, 39(7), 3-7.
Dooley, Martin, A. Abigail Payne, and A. Leslie Robb. (2011). "Understanding the Determinants
        of Persistence and Academic Success in University: An Exploration of Data from Four
        Ontario Universities." Toronto, ON: Higher Education Quality Council of Ontario.
Felder, R. M. (1995). "A longitudinal study of engineering student performance and retention. IV.
        Instructional methods." Journal of Engineering Education, 84(4), 361-367.
Felder, R. M., Felder, G. N., and Dietz, E. J. (1998). "A longitudinal study of engineering
        student performance and retention. V. Comparisons with traditionally-taught students."
        Journal of Engineering Education, 87(4), 469-480.
Finnie, Ross, and Theresa Qiu. (2008). "Is The Glass (or Classroom) Half-Empty or Nearly Full?
        New Evidence on Persistence in Post-Secondary Education in Canada." In Who Goes?
        Who Stays? What Matters? Accessing and Persisting in Post-Secondary Education in
        Canada, eds. R. Finnie, R. E. Mueller, A. Sweetman, and A. Usher. Montreal and Kingston:
        McGill-Queen's University Press.
Finnie, R., and Qiu, H. (2008). "The patterns of persistence in post-secondary education in
        Canada: evidence from the YITS-B dataset." A MESA Project Research Paper. Toronto, ON:
        Educational Policy Institute.
Huynh, K. P., D. T. Jacho-Chavez, and J. K. Self. (2010a). "The Efficacy of Collaborative
        Learning Recitation Sessions on Student Outcomes." American Economic Review,
        100(2), 287-291.
Huynh, K. P., D. T. Jacho-Chavez, and J. K. Self. (2010b). "The Distributional Efficacy of
        Collaborative Learning Recitation Sessions on Student Outcomes." JSM Proceedings,
        Alexandria, VA: American Statistical Association, 4406-4418.
Nagda, B. A., Gregerman, S. R., Jonides, J., von Hippel, W., and Lerner, J. S. (1998).
        "Undergraduate student-faculty research partnerships affect student retention." The Review
        of Higher Education, 22(1), 55-72.
Springer, L., M. E. Stanne, and S. S. Donovan. (1999). "Effects of Small-Group Learning on
        Undergraduates in Science, Mathematics, Engineering, and Technology: A Meta-analysis."
        Review of Educational Research, 69(1), 21-51.
Yamarik, Steven. (2007). "Does Cooperative Learning Improve Student Learning Outcomes?"
        Journal of Economic Education, 38(3), 259-277.