JALN Volume 2, Issue 2 - September 1998
The SCALE Efficiency Projects
Lanny Arvan*
John C. Ory**
Cheryl D. Bullock**
Kristine K. Burnaska**
Matthew Hanson**
* Sloan Center for Asynchronous Learning Environments, University of Illinois at Urbana-Champaign, and
**Office of Instructional Resources, University of Illinois at Urbana-Champaign
Correspondence on this paper should be sent to:
Lanny Arvan, Director
SCALE
1406 West Green Street
Urbana, Illinois 61801-2291
Phone: 217-333-7054
Email: l-arvan@uiuc.edu
ABSTRACT
This paper presents evidence from nine “Efficiency Projects” that were SCALE’s focus in the
1997-98 academic year. The Efficiency Projects were specifically aimed at using ALN to achieve
higher student/faculty ratios, without sacrificing instructional quality. The study concentrates on
data amassed for the fall 1997 semester. Evidence was collected on the cost side, for ALN
development and delivery, and the performance/attitude side, from both student and faculty
perspectives. The study supports the view that, when a sensible pedagogic approach is embraced
that affords students avenues to communicate about their learning, ALN can produce real
efficiency gains in courses without sacrificing the quality of instruction.
KEY WORDS
Efficiency Projects
Quality of learning
I. INTRODUCTION
This paper presents evidence from nine “Efficiency Projects” that were SCALE’s focus in the
1997-98 academic year. The study concentrates on data amassed for the fall 1997 semester.
Evidence was collected on the cost side, for ALN development and delivery, and the
performance/attitude side, from both student and faculty perspectives.
The Efficiency Projects were specifically aimed at using ALN to achieve higher student/faculty
ratios, without sacrificing instructional quality. The higher student/faculty ratios occurred in
some cases by increasing the number of students taught, in other cases by reducing the size of the
instructional staff. One common feature shared by these projects is class size. All were in large
undergraduate classes. Another common feature was the reliance to some degree on automatic,
Web-based grading software. Yet there were substantial differences across these projects.
The courses were in Chemistry (General Chemistry and Advanced Organic Chemistry for
Biology majors), Circuit Analysis (Introductory), Differential Equations, Economics
(Microeconomics Principles and Intermediate Microeconomics), Microbiology (Introductory),
Spanish (Intermediate Grammar), and Statistics (Introductory for non-technically oriented
students). The Principal Investigators (PIs) on these projects differed in their experience in
teaching with ALN. Some were among the original SCALE grantees (and among these some had
significant relevant prior experience). Others had less experience. Indeed, the Spanish PI was a
relative computer novice and was using ALN for the first time. Some of the courses relied
heavily on graduate assistants. Others used undergraduate peer tutors. Some of the courses used
asynchronous conferencing primarily as a means for providing help to students. Others used
synchronous text-based chat for this purpose and used asynchronous conferencing as a means for
students to do written work online. In some cases the developers of the online materials were
also providing the face-to-face delivery of instruction. In other cases the authoring, presentation,
and coaching functions were separated across individuals. Some courses retained the traditional
lecture intact. Others substantially reduced face-to-face contact hours.
With all this variation, it is probably better for the reader to interpret the results as a collection of
case studies rather than as a cross section of evidence on ALN, viewed as a precisely pinpointed
approach to online instruction. We do try to draw some general conclusions where we think it
appropriate, both about ALN instruction in a large class setting and about using ALN to attain
efficiency ends.
A critical issue is the extent to which the findings presented in this paper are replicable. A big
part of the replicability question is whether the results should be attributed to ALN or to the PIs themselves. In the
vast majority of the Efficiency Projects, the PIs were early adopters with a great deal of
enthusiasm for online teaching. Whether the results translate to mainstream faculty remains an
open question. Another factor is the general computing environment. Computer technology
permeates daily life at the University of Illinois at Urbana-Champaign (UIUC) and thus it might
be equally important to ask whether the results would hold at a campus where computing is less
firmly imprinted into the culture. Yet another significant issue is the extent of up-front learning
needed, for the support organization as well as for the instructors, as a precursor to any program
aimed at utilizing ALN for efficiency ends. Whether others would require the two-year lead for
general ALN development, as did SCALE, is an open question. To give the reader more
context for these and related issues, we briefly review how SCALE has grown, from its origins to
the present.
A. Brief SCALE History
SCALE was formed in spring 1995 with a $2.1 million grant from the Alfred P. Sloan Foundation
and a generous match from the University of Illinois at Urbana-Champaign. The grant covered a
three-year period that ended after the spring 1998 semester. The goal was to bring 15 ALN
courses online a year. In fact, in the 1997-98 academic year there were approximately 80 courses
per semester supported by SCALE. These courses enrolled about 8000 students per semester.
The Efficiency Projects represent only a small number of the courses supported by SCALE, but
account for about half the enrollments.
SCALE’s primary mission was to support ALN course development in an on-campus setting.
Initially, Sloan had set four targets for this on-campus ALN to achieve. These were to improve
retention, to decrease time to degree, to demonstrate verifiable increases in student learning, and
to lower the cost of instruction. Over time, these targets have been modified, based on the
experience with ALN and its implementation on the UIUC campus.
While there were some Web-based activities in SCALE courses at the outset, the bulk of the
initial ALN work entailed use of asynchronous conferencing. At the start there were two
asynchronous products supported, the now defunct PacerForum and the still popular FirstClass.
Both of these are client-server software. The user must have the client installed on the desktop
computer where the user is working. The client allows the user to access the server over the
network. It is the server where all the information is stored. In 1996-97 SCALE dropped support
of PacerForum but began to support Web-based conferencing, where the Web browser (initially
Netscape Navigator, later also Microsoft Internet Explorer) serves as the client. Web-based
conferencing allows for a more seamless movement between course-related Web materials and
the conferencing environment, a distinct advantage. In 1996-97 SCALE supported the product
WebNotes. In 1997-98, SCALE switched to WebBoard and will continue to support use of this
product in the upcoming year. In spite of the increasing popularity of the Web, many SCALE
faculty continue to use FirstClass. The reasons for this loyalty to FirstClass are many and varied:
1) they have had good success with it in the past, 2) it is what they know and they don’t want to have to
learn something else, and 3) they view the current Web-based alternative as inferior.
Increasingly, SCALE faculty have come to put their ALN materials on the Web. In addition to
the standard syllabus and lecture notes, simulations (primarily in science and engineering
courses) that had previously been delivered via dedicated client software were moved to the Web.
Moreover, after the pioneering projects in 1995-96, much effort was put into authoring questions
for CyberProf [1] and Mallard [2], products developed at UIUC that allow students to self-teach via intelligent assessment of short-answer questions delivered through the Web. Without a
doubt, Web delivery became an increasingly important component of the teaching strategy in
SCALE-supported courses.
Apart from an evolution in the technology, there was also a transformation in the pedagogy. Over
time, the original grantees came to increasingly trust their ALN teaching approach. ALN became
less of an experiment and more an established style, with a heavy emphasis on the assignments
that students were to complete. This regularizing of ALN allowed SCALE to provide its support
in a consistent, well-prescribed manner. It also allowed grantees who got started in year two of
the grant and even more so in year three to get current with ALN teaching in an accelerated
manner. They had to learn the ALN software, to be sure, but there was less need to tinker with
the pedagogy and wonder if it would work.
There has been an independent evaluation team from the start of the SCALE project. That team
is headed by John C. Ory and includes Cheryl Bullock and Kristine Burnaska. They produced
semester-by-semester evaluations starting in fall 1995 and culminating in spring 1997
[3],[4],[5],[6]. Matthew Hanson joined the evaluation team in summer 1997, to work exclusively
on the Efficiency Projects. While the evaluation team has been in frequent contact with SCALE
administration and, in particular, the evaluation strategy of the Efficiency Projects was discussed
extensively, the actual data collection effort has been the sole province of the evaluation team.
This independence helped to minimize the chance of misrepresentation of the findings and to
reduce the awkwardness involved in the data collection, particularly in those cases when students
or faculty reported that things weren’t going so well.
B. Sources of Productivity Improvement
Studies of computer technology use aimed at increasing instructional productivity are quite rare.
The Rensselaer Studio Courses offer one example [7]. Some work done at Michigan State
University by Ed Kashy, Michael Thoennessen, et al. [8], [9], is closer in spirit to the present
study. That is essentially the entire list. That such work is indeed rare is confirmed by Polley
Ann McClure [10]. “While there are some cases in which we can document improved
educational output as the result of technology intervention, in a brief survey of the literature, I
could find no studies documenting improved educational output per unit cost. The educational
gains have been at huge cost, in terms of investment in both equipment and software, but more
significantly, in faculty and support staff time.” Similarly, David Noble [11], a notable opponent
of online education, cites the work of Kenneth Green [12] when arguing, “Recent surveys of the
instructional use of information technology in higher education clearly indicate that there have
been no significant gains in either productivity improvement or pedagogical enhancement.”
That such documentation is so rare suggests two potential explanations: (1) it is not possible to
generate productivity increases with computer technology, and (2) it is possible, but the
incentives are not right for us to witness them. Robert Koob [13] makes a convincing case for the
second hypothesis. Yet affirming that incentives are weak does not in itself prove that computer
technology can generate instructional productivity gains. More direct evidence is needed and that
provides the raison d’être for our study.
Strategic thinking about how instructional technology should be used for advancing productivity
ends has clearly outstripped the empirical work in this area. Much of this strategic thinking has
come out of Educom’s National Learning Infrastructure Initiative (NLII). Examples include the
papers by Carol Twigg [14], [15], D. Bruce Johnstone [16], and William F. Massy and Robert
Zemsky [17]. The ideas behind the SCALE Efficiency projects have been influenced by this
work. Yet it should be understood that making these ideas operational requires compromise, in
both the implementation and in the measurement. It is our hope that this paper gives the reader
some insight into the type of compromises that are needed to get actual productivity projects
underway and the variety of measurement problems that arise as a consequence.
Furthermore, there is a fundamental conceptual point that should be considered where the NLII
philosophy departs from the ALN philosophy. The basis of the NLII thinking is that educational
technology is capital and that any productivity gains must come as capital input substitutes for
labor input. While this capital for labor approach is not entirely absent in the ALN approach, it is
not the whole story. With ALN, much of the productivity increase comes from labor-for-labor
substitution – inexpensive student labor for expensive faculty labor. (The TLT Affiliate of
AAHE, headed by Steven Gilbert and Stephen Ehrmann [18], vigorously argues for more of this
type of labor-for-labor substitution, but to date they have concentrated their focus on the
instructional technology support arena rather than in the online classroom itself.) Viewing the
students’ time as a productive input, as suggested by Lanny Arvan [19], some of this productivity
gain arises from peer-to-peer communication. (Note that we do not cost out this student time in
the measurement component of this paper; however, some demographic evidence suggests how
such a costing out should be done [20].) Additional productivity gains emerge from student
interaction with peer tutors who receive remuneration for the help they provide. In the ALN
approach, it is critical to view networked computers as, in part, communication tools. This allows
the ALN approach to make the instruction more personal while simultaneously increasing
productivity. At least, that is the ideal.
There has been a change of thinking within SCALE administration about how to deliver on the
Sloan objectives. During the first year of the SCALE project, there was an expectation that the
desired efficiency outcomes would come as a byproduct of ALN implementation. This was due to the
enhanced peer-to-peer interaction and the avoidance of wasteful duplication of effort through the
instructor answering common student questions once, via posts to a public class conference. It
was also expected that efficiency gains could be had in all ALN courses. In fact, most SCALE-affiliated faculty reported increased time involved in instruction as a long-term proposition,
because of the increased contact with the students online. Subsequently, some of these instructors
have modified their views about the need to be online so frequently for their students to have
good access. But the view remains that ALN teaching is arduous. Thus, it became apparent that
the byproduct approach would not achieve the desired results. Moreover, it also became clear
that efficiency outcomes would be difficult or impossible to attain in small ALN classes. There
were two reasons for this that perhaps should have been obvious at the outset of the project but
were not. First, if there was substantial up-front development in a small class, such development
could not be amortized over a large number of students. Second, in a small class there is very
limited opportunity to exploit labor-for-labor substitution. When SCALE administration
ultimately contracted for the efficiency projects [21], SCALE targeted large classes only.
Another consequence of abandoning the byproduct approach was the need to put in specific
incentives to produce efficiency outcomes. After the first year of SCALE, grants to PIs were
reduced so that more projects could receive funding. This trend was reversed for some of the
Efficiency Projects, which received grants that were as large as those grants given in the first
year. Moreover, SCALE was able to obtain assurances from the UIUC administration that any
savings produced could be retained within the department where those savings were generated.
C. Further Caveats
In the main, the SCALE Efficiency projects represent mature ALN development in large classes
where the ALN has now been focused on efficiency ends. There are many other ALN courses
that SCALE currently supports where no attempt is being made to produce efficiency outcomes.
Among these are some large classes. Thus, we are not arguing that large size per se makes a class
a good candidate for an efficiency project. For example, SCALE supports an introductory
comparative literature course that enrolls about 250 students a semester. The course is taught
with a lecture once a week. There are also small sections run by graduate assistants under the
supervision of the faculty member who delivers the lecture. The course is writing-intensive and
satisfies the campus Composition II requirement. In spite of the course size, the possibility for
capital substitution is limited here. Competent evaluators must assess the students’ written work.
Computer assessment of the writing is not possible, because the assessment is so contextually
based. It can’t be done via a search for key words. This requirement of competent assessment
also limits the possibility of labor-for-labor substitution in this course. We think that ALN is
improving learning, but we have no way to quantify the learning, so this course is not one of our
Efficiency Projects. There are also SCALE-supported courses currently taught in such an
inexpensive manner – large lecture with few if any graduate assistants to support the course – that
it seems foolhardy to try to further reduce the cost of instruction.
We are also not arguing that the SCALE approach can work everywhere, technological
considerations aside. The reliance on peer tutors, in particular, requires highly able students who
can serve in this capacity and feel they are doing something socially beneficial in the process.
The SCALE approach likely can work well at other institutions in the Big Ten and at other
similarly regarded public campuses. To what other institutions the approach can be profitably
extended is an open question.
One further point bears mention here. There has been a negative reaction to using educational
technology for efficiency ends, emerging from various pockets of concerned faculty [22], [23].
Much of this reaction relates to the effect on faculty employment. The capital substitution
argument would seem to suggest a need for fewer faculty. Certainly there is a fear that this will
be the case. Reducing faculty employment is viewed as ‘bad’ in many quarters. It is our view
that on the UIUC campus the SCALE Efficiency Projects will have little or no impact on faculty
employment, though we do anticipate a big impact from these projects overall. It is graduate
student employment that will be affected the most dramatically, if the SCALE Efficiency Projects
become more widespread on campus. The reason for this is simple. In the vast majority of the
courses that SCALE has been targeting, graduate students do the bulk of the teaching. The course
coordination function remains in the hands of a faculty member, even with ALN. The upside of
this is to reduce the pressure on graduate student enrollment to staff large introductory
undergraduate courses. This should allow graduate student enrollment to better track the new
Ph.D. job market in the individual discipline and to better match the quality of the particular
degree program. Furthermore, to the extent that the changes in graduate student enrollment can
be made without disenfranchising students who are currently enrolled, simply by adjusting the
size of entering cohorts, it is not obvious that there is a downside to this approach.
II. A SIMPLE PRODUCTIVITY MODEL
All the Efficiency Projects entailed at least some up-front development. This development can be
thought of as part learning -- both attaining a comfort level with the software and formulating a
successful pedagogic strategy -- and part authoring/online publishing. In collecting the data,
development costs were grouped into three categories. First, there are faculty costs (e.g., course
buyouts and summer support). Second, there are programming costs (e.g., hourly wages or
assistantship support of student programmers). Last, there are equipment costs (e.g., the cost of
desktop computers, the pro rata share of server and license costs allocated to the particular
project, and the cost of software).
Subsequent to the up-front development, each Efficiency Project produced some recurrent
benefit. In courses where the overall enrollment remained unchanged, this benefit can be
envisioned as a reduction in operating costs, measured on a semester-by-semester basis.
Operating costs also include three components. First, there is the pro rata share of the faculty
member’s salary (plus benefits) allocated to teaching the ALN course. Second, there is the cost
of other course personnel, either graduate students on assistantship or student hourlies. Last, there
is the pro rata share of common costs, particularly SCALE support staff.
It is helpful to think of the total benefit per semester as the product of two factors: 1) the
reduction in operating cost per student and 2) the number of students in the class. This is the
entire benefit when overall course enrollment remains unchanged. The benefit calculation is a bit
more complicated when ALN allows for an expansion of overall enrollment. Enrollment
expansion can occur only if there had been unsatisfied demand for the course, in which case the
benefit on the cost side itself has two components and there is a benefit on the demand side as
well. These three components are the operating cost reduction on the original class size, the
imputed operating cost reduction on the increase in class size, and the benefit that accrues to
those students who would have been rationed out had course capacity not expanded. We did not
try to measure this third component. We simply note that measurement of the first two
components understates the recurrent benefit.
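To keep the accounting in one place, the per-semester benefit can be summarized in a single expression. The notation below (Δc for the reduction in operating cost per student, n0 for the original enrollment, Δn for the added enrollment, and V for the unmeasured benefit to students who would otherwise have been rationed out) is introduced purely for exposition and is not used elsewhere in the paper.

```latex
% Per-semester benefit B of an Efficiency Project (expository notation only).
% \Delta c : reduction in operating cost per student
% n_0      : original enrollment        \Delta n : added enrollment
% V        : unmeasured benefit to students no longer rationed out
\[
  B_{\mathrm{fixed}} = \Delta c \, n_0 ,
  \qquad
  B_{\mathrm{expanded}} = \Delta c \, n_0 + \Delta c \, \Delta n + V .
\]
% Only the first two terms are measured in this study, so the reported
% recurrent benefit understates the true benefit whenever V > 0.
```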
A goal from the outset was to measure all costs in dollar terms, to best make comparisons
between the various cost components. Thus, with the exception of student hourlies, there was no
attempt made to characterize the time entailed in doing the work, say for the instructor authoring
the on-line materials. The approach we took differs markedly from time and motion studies.
Instead of measuring the time input directly, we measured the dollar amount needed to elicit the
requisite time input. In most cases we did this by directly measuring outlays, either in SCALE
grants to the PIs, or in actual salary numbers. In some cases we had to make imputations for
compensation and where we did that we go to some length to describe the calculations. The
most contentious of these imputations was determining the share of faculty salary allocated to
teaching a course. In most cases we simply assigned 25% of the nine-month salary to the course,
because the typical campus teaching load is two courses per semester in both the fall and the
spring.
That we used actual outlays bears mention for several reasons. First, the Efficiency Projects are
among a group of successful SCALE projects. A few of the original projects did not succeed.
We made no attempt to adjust our development cost measure for the risk of project failure,
because there are offsetting biases in measurement. Second, many of the Efficiency Projects
were among SCALE’s original projects and have received several rounds of funding. The
funding was given out annually and with each grant there was no guarantee of further funding in
a subsequent grant. Thus, the measured outlays might differ substantially from the case where a
multiyear development cycle was planned from the outset. Third, returning to the question raised
in the introduction about whether the results apply to mainstream faculty, there is the issue of
how the size of the SCALE grants compares to the right level of compensation for the effort: too
much, too little, or just right. Last, because others who might embark on such an ALN program
now will have the benefit of learning from those who have preceded them, their development
costs will likely not be as great as in the case of the SCALE projects.
Where possible, a computation was made to determine the number of semesters it would take to
recover the up front development cost. This calculation was done twice, once in the case of no
discounting and then again when future benefits were discounted. The effect of discounting is to
lengthen the period of cost recovery. We used a rather conservative interest rate, 9%, so that we
would have reliable bounds on the period of cost recovery [24].
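For readers who want to reproduce the calculation, the sketch below shows one way to count the semesters needed to recover a development cost, with and without discounting. The dollar figures are hypothetical, and the conversion of the 9% annual rate into a per-semester rate (two compounding semesters per year) is an illustrative assumption about the mechanics rather than a detail reported above.

```python
def semesters_to_recover(dev_cost, saving_per_semester, annual_rate=0.0):
    """Number of semesters of recurrent savings needed to cover an up-front
    development cost. With annual_rate > 0 the future savings are discounted,
    which lengthens the recovery period. Illustrative sketch only."""
    # Assumption: two semesters per year, so the semester rate compounds to the annual rate.
    semester_rate = (1 + annual_rate) ** 0.5 - 1
    recovered, semesters = 0.0, 0
    while recovered < dev_cost:
        semesters += 1
        # saving received at the end of this semester, discounted back to today
        recovered += saving_per_semester / (1 + semester_rate) ** semesters
        if semesters > 200:   # guard: savings too small ever to recover the cost
            return None
    return semesters

# Hypothetical figures: $30,000 of development, $8,000 saved per semester.
print(semesters_to_recover(30_000, 8_000))                    # 4 semesters, no discounting
print(semesters_to_recover(30_000, 8_000, annual_rate=0.09))  # 5 semesters at 9%
```

As the two calls illustrate, discounting the future savings pushes the recovery horizon out by a semester in this hypothetical case.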
This cost model clearly oversimplifies matters. Authoring does not occur entirely up front.
Teaching with ALN is an iterative process. Modifications are made to the on-line materials based
on the actual experience of teaching a class. The philosophy behind the way these authoring
costs were allocated in this study is as follows. Authoring that takes place during the semester is
treated as operating cost. Authoring that occurred in the summer and for which the PI was
compensated via a grant from SCALE is treated as up front development.
III. CATEGORIZING THE EFFICIENCY PROJECTS
In an attempt to make some relevant cross-project comparisons, we categorize the productivity
increase by whether the scheduled contact hour is with a teaching assistant or a faculty member.
This distinction is relevant for at least two reasons. First, in terms of converting instructor time to
dollars, teaching assistants are paid on a more or less uniform basis. There is much more
variation in faculty members' salary, due to variation in rank and variation in compensation across
disciplines. Moreover, since faculty obligations are a bundle of teaching, research, and service,
with the fraction of the obligation somewhat idiosyncratic, it is hard to parse out the teaching
component. The precision of the cost estimates should be understood in that light. Second, this
distinction, at least on the UIUC campus, represents the extent to which the instructor must adopt
the ALN innovation as terms of employment. Departmental standards determine the graduate
assistant load associated with a 50% time appointment, a 33% time appointment, etc.
Departments can and do change these standards upon occasion, for reasons quite unrelated to
adoption of ALN. Moreover, if some courses are taught with ALN and others not, departments
can make the standard flexible to accommodate that distinction. Teaching Assistants with the
same fractional appointment may teach three ALN sections but only two non-ALN sections. The
point is that departments can implement this, as long as in their judgment the burden on the TAs
is roughly equal under either approach and matches the fraction of the appointment. In contrast,
such differential loads cannot be imposed upon faculty members without prior consultation and
approval.
When the department that houses the project captures the productivity increase, it occurs either by
the instructor teaching more sections or by the instructor teaching more students per section.
With the first, students should perceive no difference in quality, as from their perspective the
class has not changed. With the second, student perceptions of quality (as well as objective
measures of student performance) provide evidence about the consequences on course quality.
The SCALE project in introductory Chemistry had the TAs teach more sections. Seven of the
other projects achieved the increase in productivity via larger sections. When the instructor
captures the productivity increase, the instructor's workload should drop. The SCALE project in
Microbiology introduced virtual labs to replace wet labs in some cases. During the weeks when
the virtual labs were given, the graduate lab assistants were able to focus their attention on their
research rather than their teaching. Though how the productivity increase is captured is an
important variable, we chose not to use this dimension to differentiate our projects because there
was not enough variation.
All but the project in Mathematics utilized on-line quizzing with automated grading to some
degree. Students receive the benefits of immediate feedback and repeated tries at the material. A
concomitant benefit is that instructors are relieved from the burden of grading. Yet there is more
to a large ALN course than automated grading of assignments. These ALN courses have a
substantial component of people-to-people interaction online. Some of this is peer-to-peer
interaction. The rest is between student and course staff, much of it through scheduled on-line
office hours, which are more extensive than their traditional counterpart. The on-line office hours
are frequently in the evening, when they are convenient for the students. Office hour staffing is
made affordable by having undergraduate peer-tutors, by reducing the number of class contact
hours, or some mixture of the two. Though there were some qualitative differences across
projects in how these office hours were conducted, all of the projects relied on the on-line office
hours extensively. Again, though automated quizzes and on-line office hours are very important
components of our Efficiency Projects, there is not enough variation across projects to focus on
these components as a means to categorize the projects.
For this reason, we searched for another dimension along which the SCALE Efficiency Projects
are differentiated. We ended up focusing on whether all the assignments are machine graded
(short answer), or if there was still some long answer work graded by TAs. In so doing there are
several issues we were trying to address. First, automated grading may seem reasonable to some
instructors in some disciplines, but may appear inappropriate elsewhere. As it turned out, in
several of the projects the two forms of grading were used in mixed mode with the intent of
achieving the best that both have to offer. (Here we are focusing on assignments completed out of
class. One could also look at this distinction with exams. Interestingly, some SCALE supported
courses that have all short answer assignments have long answer exams while other courses with
some long answer assignments have short answer exams.)
Second, in some of the courses
undergraduate peer tutors have also been utilized as graders. This is controversial. On the one
hand, by using undergraduate graders there is an inexpensive supply of grading assistance that the
instructor can tap from those students who have taken the course previously. On the other hand,
some faculty are suspicious that undergraduates do not have the requisite depth of knowledge of
the subject to provide good written feedback to the students. Third, it may be that written work
done online and the assessment that goes along with it are simply different from the paper analog.
In SPAN 210 much of the written feedback was actually given by other students as they made
responses to original posts. The graders' job was much more to acknowledge these efforts by the
students and much less to participate in the dialog with additional feedback. In ECON 300, the
rapid transmission of the work, coupled with the large number of graders, allowed for rapid
response to the submission. This, in turn, allowed the teams to resubmit homework in response to
the comments of the grader, a practice that is exceedingly rare with paper-based homework.
Productivity Increase /    All Automated Grading            Some Human Grading
Grading of Homework

Grad Assistants            A: BIO 122 (CyberProf)           B: SPAN 210 (Mallard)
                              CHEM 101, 102 (CyberProf)
                              ECON 102 (Mallard)

Faculty                    C: STAT 100 (Mallard)            D: ECON 300 (Mallard)
                              ECE 110 (Mallard)                 MATH 285 (Mathematica)
                              CHEM 331 (WebCT)

Table 1. Efficiency Matrix
IV. MEASURING QUALITY OF INSTRUCTION
In the contract made with Frank Mayadas, the efficiency project courses were to be redesigned so
as to improve the quality of learning or to hold the quality of learning constant. Apparent
efficiency gains that resulted in a deterioration of quality were deemed out of bounds.
As we subsequently document, these projects have been successful in lowering expenditure per
student. Elsewhere, expenditure per student is itself regarded as a quality indicator, with greater
expenditure indicating higher quality. (For example, see Money Magazine’s college ratings
[25].) Expenditure per student can be viewed as a measure of input quality. We report on our
efforts to measure the effect of ALN on the output quality in several Efficiency Projects.
To assess changes in course quality due to the use of ALN, in the ideal, we would have the same
instructor teach both an ALN section and a non-ALN section of the course and administer a
common performance standard for both sections. (Jerald Schutte has done a study along these
lines [26]. He finds that the on-line approach significantly outperforms the traditional approach.
But he is unable to control for his own teaching effort across the two sections.) In addition,
students would be randomly assigned to the sections. We could then look at indicators of student
performance and of student satisfaction as measures of output quality. Moreover, we would have
an appropriate benchmark to which we could compare the ALN approach.
As we described in the introduction, we had to make several compromises in implementing our
study. In retrospect, we think an ideal study may be impossible to implement, because of limited
resources and the ethical issues such a study raises. The ideal study can be implemented most
easily when beliefs about the teaching approach are neutral. The more it is believed that ALN is
superior to the traditional approach (or vice versa) the harder it is to implement the study.
Instructors do not want to be shown that their teaching is inferior. Students do not want to take
the version of the course that will make them less prepared to do well on the exams. And
administrators do not want to continue with the traditional approach if ALN appears to afford a
productivity advantage. We tried to come as close as possible to the ideal in our investigations of
instructional quality. Whenever possible we attempted to collect quality measures that included
assessments of student performance. We have more plentiful information about student and
instructor attitudes.
V. THE FINDINGS
In this section we proceed through the efficiency matrix presenting brief descriptions of the
projects, a summary of the cost information, and the available output/quality information. Rather
than go through the efficiency matrix in alphabetical order of the cells, we proceed on the basis of
the quality of our evidence.
A. Cell B of the Efficiency Matrix – SPAN 210
In 1996-97, an Italian professor successfully developed an ALN approach for her ITAL 101 and
102 courses. She designed vocabulary/grammar exercises for the students to complete using
Mallard as well as writing assignments done using FirstClass. This professor serves as course
coordinator while graduate TAs teach independent discussion sections.
SPAN 210, a basic course in Spanish grammar, has a similar structure to ITAL 101 and 102. The
idea behind this project was to build on the course development experience of the Italian
professor and to thereby use ALN to begin to address the "Spanish Problem" on campus. At
UIUC, and at most universities nationwide, the demand for Spanish language courses far exceeds
actual enrollment, primarily because the ability to staff these courses is limited. This demand is
fueled by the increasing internationalization of our economy. Students who wish to have a minor
in international studies need competency in a second language. Spanish is the language of
choice. On the UIUC campus, the Spanish problem will be exacerbated by a recently imposed
increase in the foreign language requirement. Though much of the demand for Spanish is in the
introductory courses, SPAN 210 has also had a chronic excess demand problem. There are
students who have wanted to take the course but who have been unable to do so because all the
slots were filled.
To initiate this Efficiency Project, the Italian Professor searched the Spanish faculty for a willing
participant, ultimately enlisting the SPAN 210 course-coordinator, who was drawn into this out of
dissatisfaction with the exercises in the textbook she was using. This search occurred in spring
1997 in response to a call from SCALE administration. At the outset of the ITAL 101-102
project, in summer 1996, it was not envisioned that it would lead to a subsequent SPAN 210
project. For this reason, we are not including the development cost of the Italian project in the
cost calculation for SPAN 210. In summer 1997 the Spanish professor and a graduate assistant
began developing on-line materials for SPAN 210. In fall 1997 two out of nine regular sections
of Span 210 were taught with ALN (utilizing both FirstClass and Mallard). Each ALN section
was twice as large as a traditional section. The ALN section met only once a week while the
traditional section met 3 times a week. This helped to keep the workload for the instructors
uniform across sections. The professor also used ALN in a Discovery section. The campus
Discovery program includes a set of courses that are for freshmen only, that are taught by tenured
or tenure-track faculty, and that have class size capped at 20 students. All ALN and non-ALN
sections used similar exams.
In fall 1997 the use of ALN allowed the department to increase class size from 19 students to
approximately 38 students in each of two sections. The department believed that by using ALN
to teach all sections of SPAN 210 in the future they would be able to teach approximately twice
as many students, without adding personnel. In fact, in spring 1998, all sections of SPAN 210
were taught with ALN and all have experienced a doubling of enrollment relative to historical
norms.
It is important to observe that the development period for ALN in SPAN 210 was only summer and
fall 1997, during which the Spanish professor authored the on-line material for Mallard. This shorter
development cycle can be attributed, in large part, to the Italian professor’s prior development.
As it turns out, however, the Spanish professor did not merely translate the questions that had
been written in Italian, but rather wrote her own questions that suited her view of how to teach the
subject. This wasn’t planned for at the beginning of summer 1997. It turned out that way
because the Spanish professor developed an increasing enthusiasm for the enterprise.
Nonetheless, we expect to see shorter development cycles in projects that are derivatives of
earlier development. If these derivative projects are aimed at efficiency ends, they are likely to
produce a dividend quickly.
That is the case for the SPAN 210 project. Total development cost (measured by the size of the
SCALE grant) was $15,336, divided roughly evenly between faculty summer support and student
programming support. Even with discounting, it is clear the first full-scale ALN offering of this
course in spring 1998 produced a cost saving that more than covered this development cost. This
project has already produced a dividend. It has also paved the way for a further, more ambitious
Efficiency Project in the introductory Spanish sequence.
The SPAN 210 course is the closest we came to conducting the ideal study of output quality. The
two ALN sections were compared to two non-ALN sections used as a control group. These four
sections had common exams. We have the results from the two midterms. The ALN sections
had approximately twice the number of students as the traditional sections, so in comparing the
distributions one should focus on the cumulative distribution functions, not on the absolute
number within each category. There was also a common attitudinal survey administered and a
focus group for each section.
Table 2 shows the results for Midterm 1 and Table 3, the results for Midterm 2 for the ALN and
non-ALN sections of Spanish 210. Table 2 shows that the non-ALN section had more students at
the extremes of the distribution. This implies the two distributions cannot be ranked via
first-order stochastic dominance. The non-ALN section had a slightly higher median, in the 91 – 93
range. The ALN section had a median in the 87 – 90 range. Table 3 shows the reverse. The
ALN section had more students at the extremes. There were some drops in both sections, more
percentage-wise in the non-ALN section. This explains, perhaps, the result at the lower extreme
of the distribution. The medians were the same for Midterm 2, in the 87 – 90 range.
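A short sketch makes the comparison concrete. Using the Midterm 1 bin counts reported in Table 2 below, it converts the raw counts to cumulative percentages (so that sections of different size can be compared) and checks whether either empirical distribution first-order stochastically dominates the other. The code is illustrative only and is not part of the original analysis.

```python
# Midterm 1 bin counts from Table 2, ordered from "97 - 100" down to "60 & below".
aln     = [2, 16, 12, 10, 9, 8, 4, 7, 5, 3, 2, 0, 0]   # N = 78
non_aln = [3,  6, 11, 10, 4, 1, 2, 0, 0, 2, 0, 0, 1]   # N = 40

def cumulative_pct(counts):
    """Cumulative percentage of students at or above each score range."""
    total, running, out = sum(counts), 0, []
    for c in counts:
        running += c
        out.append(100.0 * running / total)
    return out

cum_aln, cum_non = cumulative_pct(aln), cumulative_pct(non_aln)

# One distribution (first-order stochastically) dominates the other if its share of
# students at or above every cutoff is at least as large, and strictly larger somewhere.
aln_dominates = all(a >= b for a, b in zip(cum_aln, cum_non)) and cum_aln != cum_non
non_dominates = all(b >= a for a, b in zip(cum_aln, cum_non)) and cum_aln != cum_non
print(aln_dominates, non_dominates)   # False False -> the distributions cannot be ranked
```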
Midterm 1            ALN                            non-ALN
                     n      %       Cumulative      n      %       Cumulative
97 - 100             2      2.56      2.56          3      7.50      7.50
94 - 96              16     20.51     23.08         6      15.00     22.50
91 - 93              12     15.38     38.46         11     27.50     50.00
87 - 90              10     12.82     51.28         10     25.00     75.00
84 - 86              9      11.54     62.82         4      10.00     85.00
81 - 83              8      10.26     73.08         1      2.50      87.50
77 - 80              4      5.13      78.21         2      5.00      92.50
74 - 76              7      8.97      87.18         0      0.00      92.50
71 - 73              5      6.41      93.59         0      0.00      92.50
67 - 70              3      3.85      97.44         2      5.00      97.50
64 - 66              2      2.56      100.00        0      0.00      97.50
61 - 63              0      0.00      100.00        0      0.00      97.50
60 & below           0      0.00      100.00        1      2.50      100.00
                     N=78                           N=40

Table 2. Comparison of ALN and non-ALN Midterm 1 Results in Spanish 210
Midterm 2            ALN                            non-ALN
                     n      %       Cumulative      n      %       Cumulative
97 - 100             8      10.67     10.67         2      5.71      5.71
94 - 96              11     14.67     25.33         7      20.00     25.71
91 - 93              12     16.00     41.33         3      8.57      34.29
87 - 90              13     17.33     58.67         8      22.86     57.14
84 - 86              9      12.00     70.67         7      20.00     77.14
81 - 83              10     13.33     84.00         2      5.71      82.86
77 - 80              6      8.00      92.00         3      8.57      91.43
74 - 76              0      0.00      92.00         1      2.86      94.29
71 - 73              2      2.67      94.67         1      2.86      97.14
67 - 70              1      1.33      96.00         1      2.86      100.00
64 - 66              1      1.33      97.33         0      0.00      100.00
61 - 63              1      1.33      98.67         0      0.00      100.00
60 & below           1      1.33      100.00        0      0.00      100.00
                     N=75                           N=35

Table 3. Comparison of ALN and non-ALN Midterm 2 Results in Spanish 210
We also perform a comparison of means, under the assumption that all observations occur at the
midpoint of the cell. For example, all observations in the 97 – 100 range are treated as occurring
at 98.5. Letting x denote the ALN outcome and y the non-ALN outcome, we report, for each
midterm, values of the statistic

z_m = (x_m - y_m) / (s_x^2/N_x + s_y^2/N_y)^(1/2),

where x_m and y_m are the ALN and non-ALN means for midterm m, s_x^2 and s_y^2 the corresponding
variances, and N_x and N_y the numbers of students.
Under the null hypothesis that there is no difference in the means for the two classes, z_m should
have a standard normal distribution. For Midterm 1, we see that the non-ALN sections did score
significantly higher than the ALN sections at the 90% confidence level, but not at the 95%
confidence level. For Midterm 2, there is no significant difference in the scores of the two
sections, even at the 99% confidence level.
                  ALN Sections                 non-ALN Sections
                  mean     var      N          mean     var      N          z_m
Midterm 1         85.4     77.76    78         88.43    66.98    40         -1.85
Midterm 2         87.22    73.61    75         87.31    51.66    35         -0.06

Table 4. Comparison of means between ALN and non-ALN Midterm Scores in SPAN 210
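As a check on the calculation, the sketch below reconstructs the Midterm 1 row of Table 4 from the bin counts in Table 2. Two details are inferences rather than statements from the text: the open-ended "60 & below" range is assigned a midpoint of 58.5, and the variances are computed with the population convention (dividing by N); with those choices the tabled values are reproduced to within rounding.

```python
from math import sqrt

# Bin midpoints for the score ranges of Table 2; 58.5 for "60 & below" is an inferred value.
midpoints = [98.5, 95, 92, 88.5, 85, 82, 78.5, 75, 72, 68.5, 65, 62, 58.5]
aln     = [2, 16, 12, 10, 9, 8, 4, 7, 5, 3, 2, 0, 0]   # Midterm 1, ALN, N = 78
non_aln = [3,  6, 11, 10, 4, 1, 2, 0, 0, 2, 0, 0, 1]   # Midterm 1, non-ALN, N = 40

def binned_stats(counts, mids):
    """Mean and (population) variance when every score sits at its bin midpoint."""
    n = sum(counts)
    mean = sum(c * m for c, m in zip(counts, mids)) / n
    var = sum(c * (m - mean) ** 2 for c, m in zip(counts, mids)) / n
    return mean, var, n

x_mean, x_var, n_x = binned_stats(aln, midpoints)
y_mean, y_var, n_y = binned_stats(non_aln, midpoints)

# The z statistic defined in the text for the difference in section means.
z = (x_mean - y_mean) / sqrt(x_var / n_x + y_var / n_y)
print(round(x_mean, 1), round(y_mean, 1), round(z, 2))   # 85.4 88.4 -1.85 (cf. Table 4)
```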
Based on the Midterm 2 performance, it appears that doubling the class size and meeting with the
TA only 1 day a week (instead of 3 days a week in the non-ALN sections) has not had an adverse
effect on student performance.
The attitude surveys indicate that students in the ALN sections had significantly less contact with
their peers and with their instructor than students in the traditional section. Interestingly, the
students in the ALN sections indicated they had significantly greater access to their instructors.
Thus, the reduced interaction with the instructor in the ALN sections must be, in part, a matter of
choice by the students.
These findings are explained by the following. The traditional sections met for 3 hours a week.
The ALN sections met one hour a week. During the other two scheduled hours, the instructor
held office hours. (There were also office hours at other times.) This meant the students had no
other obligations during at least some of the scheduled office hours and therefore should have
been expected to report they had good access to the instructor. Office hours were voluntary.
Many students did not avail themselves of this contact opportunity. The focus group discussions
indicate that for a grammar course with the exercises in Mallard, many students did not perceive
the need to discuss the material with the instructor. This is why there was less contact. The
students perceived the course to be self-paced (though it did have the one weekly class session).
Apparently, they did the work on their own, rather than in groups. In general, we would like to
see a lot of student-to-student contact in an ALN course. But for this type of material, that
contact may be unnecessary.
There were three summative questions posed in the survey:
1. How difficult was the material?
2. Would you recommend the course to a friend?
3. How much did you learn?
The responses to questions (1.) and (2.) did not significantly vary from the ALN to the non-ALN
sections. Students found the material moderately easy and more would recommend the course
than would not. On question (3.), the ALN students reported learning less than did the non-ALN
students. This is somewhat surprising in light of the exam results. From the responses to this
question and to the question about Mallard in the focus group, it appears that about 75% of the
class liked the Mallard approach and thought they got something worthwhile out of it. But some
of the students thought the Mallard exercises dull and would have preferred more human
interaction. We suspect it is those students who reported not learning much in the course. It
would be very interesting to know 1) whether these students had problems due to computer
literacy and 2) how these students actually did on the exams. The surveys were anonymous so we
do not have this information. It does raise the question, however, whether the students might be
learning, at least as measured by the exams, without their perceiving it.
B. Cell C of the Efficiency Matrix – STAT 100, ECE 110, and CHEM 331
STAT 100
STAT 100 is a course that fills UIUC’s relatively new General Education quantitative
requirement. For most of these students it is the only quantitative course to be taken in college.
Several years ago the experienced ALN instructor developed a software package for the course,
WinPop, designed to be easy to use by the students (because many of these students were not
very computer savvy) and to demonstrate fundamental statistical concepts in a highly visual
manner. When SCALE came along, this instructor introduced FirstClass conferencing into the
course. (He has since switched to WebBoard.) He also converted his software to Java, so that it
can be viewed in Netscape. He has recently started to use Mallard for administering on-line quizzes.
By using ALN, this experienced instructor has increased the number of students he teaches in his
section of STAT 100. As shown in Table 5, enrollments in STAT 100 continue to increase. The
experienced ALN instructor is working with other professors who teach the course to get them to
use ALN. The hope is that all professors who teach STAT 100 can raise their enrollment levels
by using ALN and subsequently accommodate the expanding enrollments without increasing
instructional delivery costs.
Semester          Number of Students
Fall 1995         360
Spring 1996       351
Fall 1996         430
Spring 1997       426
Fall 1997         454

Table 5. Overall Enrollment in STAT 100 by Semester
Table 6 presents information on section-by-section enrollments in STAT 100. (Note that
Discovery sections of STAT 100 are not included in the calculations in Table 6, but they do count
in the overall enrollments in Table 5.) The ALN section that was taught by the experienced ALN
instructor has been the largest section of STAT 100 since fall 1995. For the three semesters, fall
1995 – fall 1996, the ALN section had about 25 more students than the traditional sections.
Overall demand ratcheted upwards in fall 1996. This was initially met by adding a traditional
section. By spring 1997 it was apparent that this demand increase was permanent. Moreover, the
department owed the campus a Discovery section of the course that it had not taught the previous
spring. Thus, it had to accommodate over 420 students with only four sections. As an
experiment, it was decided to have the entire increased capacity be borne by the ALN section.
This experiment proved successful and enrollments in this ALN section remained high in fall
1997. Moreover, some of the other instructors began to experiment with ALN. To keep this
experimentation from being too onerous, section size was reduced. This explains the increase in
section size in the traditional sections that semester. The long-term plan is to have all sections be
ALN and to have only four non-Discovery sections.
Semester          ALN Section(s)                       Non-ALN Sections
                  Average Number of Students           Average Number of Students
Fall 1995         104 (1 Section)                      77 (3 Sections)
Spring 1996       110 (1 Section)                      80 (3 Sections)
Fall 1996         101 (1 Section)                      76 (4 Sections)
Spring 1997       187 (1 Section)                      80 (3 Sections)
Fall 1997         154 (1 Section), 49 (2 Sections)     101 (2 Sections)

Table 6. Average Enrollment Per Section in STAT 100 by Semester
The large ALN section uses an undergraduate peer tutor at an estimated cost of $1,000 per
semester, paid out of funds supplied by SCALE. This constitutes the only difference in operating
costs between the big ALN section and all other non-Discovery sections. The department provides
approximately the same amount of money to be spent on faculty, TA, and grader support in all
sections. Thus, it is clear that the big ALN section is more cost-effective than the other sections.
We have a modest amount of comparative exam results from the fall 1997 semester. These come
from common questions on one midterm. Four of the sections were involved in making these
cross section comparisons. These were ALN1, the largest section and taught by the instructor
who developed the materials; ALN2, the smallest of the four and taught by an instructor was
using ALN for the first time, adopting the approach used in the largest section; and two other
sections, non-ALN1 and non-ALN2, that used the traditional approach.
ALN1 and non-ALN1 had questions 1 – 4 in common. ALN1, ALN2, and non-ALN2 had
questions 5 – 7 in common. We did a comparison of means on a question-by-question basis.
Table 7 shows that ALN1’s students outperformed the others. They had the highest mean scores
on each question, significantly higher than non-ALN1’s students on questions 1 and 3,
significantly higher than non-ALN2’s students on questions 5 – 7, and significantly higher than
ALN2’s students on questions 5 and 7 at the 90% confidence level. The latter suggests that we
cannot be sure whether the results are attributable to ALN or instead to instructor-specific effects.
Nevertheless, the results should make one optimistic about ALN. It seems that in this class ALN
is boosting student exam performance. This is all the more impressive considering that ALN1 is
larger and has a lower cost per student.
Regarding student attitudes, we have survey data only from the large ALN section. Thus we have
no comparative information on student attitudes.
The professor used Mallard for on-line quizzes. He used WebBoard for computer conferencing
and has developed quite a lot of Web-based, highly graphical statistical material, to illustrate
basic principles. In this use of the virtual environment, STAT 100 was quite similar to ECE 110.
Moreover, the summative questions in the survey for STAT 100 were identical to those for ECE
110. Interestingly, the student responses were quite similar to those in the engineering course.
Over 90% said they found using the Web easy or somewhat easy. Eighty-five percent rated their
overall experience as good or better. And not quite 90% said they would probably or definitely
take another course that used the Web.
It must be emphasized that the STAT 100 students are non-technical (in contrast to the ECE 110
students). Consequently, it might be a reasonable inference that it is the teaching approach
coming through in the responses to the summative questions.
                ALN1                        non-ALN1
                n      %       var          n      %       var          z_m
Question 1      134    0.918   0.075        69     0.784   0.169        2.707
Question 2      118    0.808   0.155        65     0.739   0.193        1.22
Question 3      123    0.842   0.133        54     0.614   0.237        3.812
Question 4      108    0.74    0.193        57     0.648   0.228        1.471
                N=146                       N=88

                ALN1                        ALN2                        non-ALN2
                n      %       var          n      %       var          n      %       var
Question 5      142    0.973   0.027        42     0.778   0.173        44     0.454   0.248
Question 6      142    0.973   0.027        50     0.926   0.069        72     0.742   0.191
Question 7      119    0.815   0.151        41     0.759   0.183        64     0.66    0.224
                N=146                       N=54                        N=97

                ALN1 vs. ALN2      ALN1 vs. non-ALN2      ALN2 vs. non-ALN2
                z_m                z_m                    z_m
Question 5      3.349              189.6                  4.273
Question 6      1.225              106.9                  3.225
Question 7      0.84               46.4                   1.318

Table 7. Comparison of Student Performance on Common Exam Items Administered in
ALN and non-ALN Sections of STAT 100
ECE 110
Because of rapid changes in the field, the Department of Electrical and Computer Engineering
has deemed it necessary that students get more hands-on experience in the laboratory beginning
their freshman year. To achieve this end, the department has decided to move circuit analysis
from the second year to the first year of instruction. The emphasis on hands-on instruction
requires some de-emphasis on theory. The new course, ECE 110, takes a more basic approach to
the theory than the old course, ECE 270. The PI is the main instructor in the lecture component
of ECE 110. He has developed extensive materials for delivery in Mallard. All the homework is
on-line and is automatically graded.
The PI also makes extensive use of
newsgroups/conferencing. In this way he can easily keep up with problems that students may be
having with the material.
ECE 110 has been taught with ALN from the outset. Thus there is no basis for comparison with a
traditional version of the course. The lecture part of ECE 110 is less labor intensive than the
analogous part of ECE 270. Indeed, with ECE 270 as the base, all development costs, about
$65,000, had been recovered by the end of fall 1997. This should be interpreted cautiously,
however. While some of the cost savings are undeniably due to ALN, some of the savings must
be attributed to course restructuring.
We have absolute attitudinal information from the course survey. Because a newsgroup was used
in addition to Mallard, the survey refers to Web use rather than Mallard use. There were three
summative questions in the survey.
1. How easy did you find using the Web for purposes of this course?
2. How would you rate your overall experience using the Web in this course?
3. Would you take another course using the Web?
Over 90% of the class reported that using the Web was easy or somewhat easy. Eighty percent of
the class rated it as good, very good, or excellent. And eighty-eight percent of the class reported
they would probably or definitely take another class with the Web.
The survey results indicate that the bulk of the students are happy with the way the course is
delivered. We cannot tell, however, if that is due to student characteristics, the vast majority
being electrical engineering students, or if instead it is due to characteristics of course design. For
that reason, the comparison with STAT 100 is helpful.
CHEM 331
CHEM 331 is the organic chemistry field course intended for students in the Life Sciences. The
project is actually an outgrowth of the PI’s (and the students') displeasure with the use of lectures
in the course. The instructor believed she could greatly improve the course by teaching it entirely
on-line. The absence of face-to-face contact constitutes a more radical experiment than SCALE’s
other projects. There are several potential efficiencies if the project is successful, including larger
enrollments, lack of classroom-facility requirements, and additional revenue from extramural
student tuition rates.
The movement to a totally on-line format was more radical than the approach taken in the other SCALE
courses. The fall 1997 semester was the first offering of the course in this mode. The professor
advertised that the course would be taught this way. Nonetheless, she reported that a full Twentyfive percent of the students who signed up for the course did not have the requisite computer
literacy. Either the advertising failed or the students ignored it, perhaps because there is only one
section of CHEM 331 per semester. It is fair to say that some of the negative responses to the
summative questions on the CHEM 331 survey are due to these problems with computer
background, rather than with the teaching approach itself.
The three summative questions in this survey were
1. Compared to traditional (i.e., non-online) courses, how much did you learn
in this course?
2. How would you rate the overall quality of this course?
3. Would you recommend this course to your friends?
The results show that 47% thought they learned at least as much in this ALN course as in a
traditional course, 38% thought the quality was good or better, and 38% would probably or
definitely recommend the course to a friend.
Section Analysis
A comparison of the STAT 100 and the ECE 110 courses suggests that we are seeing the success
of the ALN approach itself. It is unlikely that the high marks on the summative questions in the
survey could be explained by a matching of student characteristics to the particular teaching style,
given how disparate the audiences for the two classes are.
The more mixed responses to the summative questions for CHEM 331 suggest either that some
students perceive a benefit to lecture that others, including the instructor, may not acknowledge,
or that there needs to be much more help for the students at the outset when an on-campus course
is taught totally online.
C. Cell D of the Efficiency Matrix – ECON 300 and MATH 285
ECON 300
Among the original SCALE projects, ECON 300 is the only one that from the outset was
designed with the goal of achieving cost savings in instruction. After some trial experimentation
with ALN, the instructor has gone from teaching a traditional course of 60 to teaching 180
students with ALN. The traditional class size is explained as follows. In the Economics
department as elsewhere on campus, it is believed that making eye contact with all students in
lecture is critical to being able to offer a high quality course. Countering this, overall enrollments
for ECON 300 are around 700 and the department is pressed to find enough instructors to staff
the course. The number 60 represents the high end of a balance between these competing needs.
The department has had access to amphitheater classroom seating for teaching ECON 300 and
that creates an ability to teach relatively large numbers in an intimate setting. Absent this
capacity, section size in ECON 300 might have been smaller. Indeed, not all sections use the
amphitheater classrooms and these other sections do tend to be smaller.
The large ALN section represents an abandonment of the eye-contact model and a general de-emphasis of the lecture in favor of on-line activities, with the aim that the overall quality of the
course would improve. The pedagogic strategy behind the ALN approach has several
components. There is a self-teaching component that utilizes Mallard for on-line quizzes. There
are also written problem sets done online in FirstClass. The problem sets are assessed on a team
basis. An individual team member submits a proposed solution to a particular homework
problem on behalf of the team. The submission is graded rapidly, within 48 hours, and returned
online to a team conference. As long as this occurs before the deadline for the problem set,
another team member can resubmit the problem, taking account of the grader’s comments. The
rapid turnaround time is facilitated by use of undergraduate graders. These same peer tutors also
provide office hour help, both face to face and online during the evening. The large class size
justifies having many of these peer tutors. An absolute grading scale has been imposed to
encourage the students within a team to collaborate. More than half the credit for the course is
based on the homework. This is approximately equally divided between the self-teaching work
done in Mallard and the group work done in FirstClass.
In fall 1997 (and spring 1998) there was another instructor who had his own ALN section of 60
students. This instructor was ‘apprenticing’ with the PI and used the previously developed course
materials and the same on-line pedagogy. The undergraduate TAs for both sections provided
common office hours, both face to face and online. The TAs’ grading time, however, was devoted
only to the section to which the TA was assigned. In the cost part of the study on ECON 300, the
entire focus is on the original PI. The apprentice instructor is ignored. In the output/quality part
of the study, extensive comparisons are made between the two sections, both in student
performance and in student attitude. Here we are not measuring how ALN compares to the
traditional approach, but rather whether ALN pedagogy can translate well from the creator to
another instructor.
In Table 8, we present some summary results on the cost estimates for the ECON 300 project. In
spring 1996 and again in spring 1997, the course was taught with 150 students. (The ALN version
was not offered in fall 1996.) Then in fall 1997, the enrollments in the ALN section were allowed
to increase to 180. Since our most precise operating cost information is for fall 1997, we have
calculated the numbers in Table 8 using fall 1997 data. The numbers in the first column of Table
8 are based on a pro rata share of the fall 1997 costs rather than on the historical data. If
anything, this understates the cost savings, as the peer tutor/student ratio increased from spring
1996 to fall 1997. That there are four rows in the table reflects the issue of how to allocate faculty
salary to teaching. Rows one and two reflect a low fraction, one ninth of the annual nine-month
salary per course. With a four-course annual load, this assigns four ninths of the salary to teaching,
leaving more than half of faculty time to be allocated to research and service; in that allocation,
these alternative uses of faculty time are viewed as orthogonal rather than as complementary.
Rows three and four are based on a higher fraction of faculty time, one quarter of the annual
salary per course. This makes sense if there is a strong complementarity between all uses of
faculty time, and hence it is appropriate to allocate it all to
teaching. Another issue reflected in the table is workload. Going from teaching a traditional
course of 60 students to teaching an ALN course of 180 increased the workload of the faculty
member. In general, to encourage professors to teach larger ALN sections, these sections may
have to be given more weight in the calculation of faculty workload. (For ECON 300, the PI
believed a multiple of 1.5X would be a fair estimate.) The increased compensation for the large
ALN section is reflected in rows two and four of the table. It should be noted that this represents
a hypothetical case. No actual compensation of this type was paid.
The ECON 300 project was funded in each year of the original grant. Funding was greatest
during the first year, when there was both a course buyout for the PI as well as summer support.
Total development costs are estimated at $56,224, with around 75% of that figure counting as
compensation for the PI. In Table 8 we provide an estimate of the period of cost recovery under
each scenario of faculty compensation. To get some reasonable bounds, we do this twice. First,
we do this without discounting the future cost savings. These estimates are given in column three
of the table. Then we repeat this with discounting, using 9% as the annual interest rate. As
should be obvious from the table, the period of cost recovery is sensitive to how faculty salary is
imputed (and to the discount factor). Under the most optimistic scenario, the large ALN section
of ECON 300 was already producing a hefty surplus by the end of fall 1997. Under the most
pessimistic scenario, we still have to wait a year to recoup all the development costs.
                                 Savings per     Savings per     In the black        In the black
                                 student (150)   student (180)   by (i = 0%)         by (i = 9%)
Comp = 1/9 salary
  (ALN and non-ALN)                  $83             $100        After spring 1998   After fall 1998
Comp = 3/18 salary (ALN)
  Comp = 1/9 salary (non-ALN)        $55             $71         After spring 1999   After fall 1999
Comp = 1/4 salary
  (ALN and non-ALN)                  $181            $209        After fall 1997     After fall 1997
Comp = 3/8 salary (ALN)
  Comp = 1/4 salary (non-ALN)        $122            $154        After fall 1997     After fall 1997

Table 8 – Per Student Cost Savings and Length of Recovery Period
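The recovery periods in Table 8 can be illustrated with a short calculation: accumulate the savings
from each ALN offering (per-student savings times enrollment), discount them back at the chosen
rate, and find the first offering at which the accumulated savings cover the development cost. The
sketch below is a minimal illustration under assumptions made here, not the exact computation
behind the table: the 9% annual rate is converted to a per-semester factor, the savings are assumed
to arrive once per offering, and the enrollment path loosely follows the history described above
(150, 150, then 180).

    # Sketch of a discounted cost-recovery calculation for the ECON 300 ALN section.
    # Assumptions (illustrative only): development spending occurs up front, savings
    # arrive once per ALN offering, and the 9% annual rate is applied as a
    # per-semester factor of (1.09) ** 0.5.

    DEVELOPMENT_COST = 56224          # total development cost reported in the text
    SEMESTER_FACTOR = 1.09 ** 0.5     # assumed per-semester discount factor

    def offerings_to_recover(offerings):
        """offerings: list of (enrollment, savings per student) for each ALN offering."""
        recovered = 0.0
        for t, (enrollment, savings) in enumerate(offerings, start=1):
            recovered += enrollment * savings / SEMESTER_FACTOR ** t
            if recovered >= DEVELOPMENT_COST:
                return t
        return None  # not recovered over the horizon considered

    # Comp = 1/4 salary scenario from Table 8: $181 saved per student at an
    # enrollment of 150 and $209 per student at 180.
    print(offerings_to_recover([(150, 181), (150, 181), (180, 209), (180, 209)]))

With these figures the development cost is covered by the third ALN offering, which is in the spirit
of the "after fall 1997" entry for that row of the table; the other rows follow by substituting their
per-student savings.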
We do not have any ALN versus non-ALN information for ECON 300. We do have substantial
information, however, that compares the two ALN sections. We have exam results for common
questions on two midterms and a final. (Note that on the final there were quite a few students who
took a conflict exam. The scores of those students are not included in the sample.) In this case,
we are providing evidence on whether the ALN approach transfers readily from the developer of
the materials to another instructor. Again we perform a comparison of means. On each of the
midterms there were 5 common questions (4 points apiece). On the final, there
were 14 common questions.
              Instructor Experienced with ALN     Instructor New to ALN
              mean       var        N             mean       var        N          zm
Midterm 1     11.95      22.75      170           13.10      22.37      58        -1.60
Midterm 2     12.53      17.75      166           12.07      29.99      57         0.58
Final         30.40      42.79      155           30.37      31.14      49         0.02

Table 9 - Comparison of Means on Common Exam Items Administered in
ALN Sections of ECON 300
On Midterm 1, the inexperienced instructor’s students actually outperformed the experienced
instructor’s students, though the difference is not statistically significant at the 90% confidence level. On the second
midterm and the final, there is essentially no difference in the performance of the students in the
two different sections. This evidence suggests that the ALN approach did transfer, at least as far
as its impact on student performance.
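The zm values in Table 9 are consistent with the usual large-sample statistic for a difference in
means computed from the summary figures. The sketch below assumes that formula (it is inferred
from the reported numbers rather than quoted from a stated definition) and reproduces the table's
values to within rounding.

    # Sketch: large-sample z statistic for a difference in means, computed from
    # the Table 9 summary figures.  The formula is assumed here:
    # zm = (m1 - m2) / sqrt(v1/n1 + v2/n2).
    from math import sqrt

    def z_mean_difference(m1, v1, n1, m2, v2, n2):
        return (m1 - m2) / sqrt(v1 / n1 + v2 / n2)

    #              (experienced instructor)   (instructor new to ALN)
    exams = [
        ("Midterm 1", (11.95, 22.75, 170), (13.10, 22.37, 58)),
        ("Midterm 2", (12.53, 17.75, 166), (12.07, 29.99, 57)),
        ("Final",     (30.40, 42.79, 155), (30.37, 31.14, 49)),
    ]

    for name, a, b in exams:
        print(f"{name}: {z_mean_difference(*a, *b):.2f}")
    # Prints approximately -1.60, 0.58, and 0.03; the last differs from the
    # table's 0.02 only in the final rounded digit.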
We also have the results of surveys administered to both sections and group interviews performed
by the evaluation team. In the surveys, there were no significant differences between the
responses of the students in the two sections on the vast majority of the questions. The
experienced instructor’s students did find Mallard somewhat easier than did the students in the
other section. But that was the exception. On the following summative questions, for instance,
there was no significant difference in the responses of the students.
How did the use of Mallard and FirstClass in this course affect:
1. The amount of your learning?
2. The amount of your motivation to learn?
3. Your familiarity with computers?
Yet in spite of this, the experienced instructor’s students rated the overall experience significantly
higher. This could either be because the ALN approach didn’t transfer completely or because
there are instructor-specific effects coming through in the overall rating number.
The group interviews produced a good contrast between the two sections on the efficiency-of-instruction issue.
The interviewer asked:
One thought is that the use of a program such as Mallard will allow classes to have more
students yet maintain the same quality level. For example, one possibility in a class like
this is that if the instructor has to grade less homework they will have the necessary extra
time to interact with more students. What do you think?
In the large section (180 enrolled) taught by the experienced instructor some responses
were:
• This would be okay. I think I get a lot of contact with the material because of the
Mallard homework. The size of this class is okay.
• There were only minor problems using ALN; whether or not a course is high quality
depends more on the accessibility of TAs and the quality of their help. So, I think it
would be okay. TAs just have to be accessible on-line often.
In the normal size section (60 enrolled) taught by the instructor new to ALN about half of the
students indicated increased enrollment would be a problem.
• With higher enrollment, students will be more inhibited to ask questions in class.
• Higher enrollment will limit students’ access to the instructor and TAs.
The stark difference in responses here points to an issue that must be overcome when
attempting to attain efficiency in instruction. Efforts must be made to counteract the fear
that students have that large classes will be impersonal and ineffective as learning
environments. It seems critical that students be offered a communication channel, perhaps
multiple communication channels. That does not mean, however, that it must be the
instructor who staffs these channels.
MATH 285
The PI has been involved in teaching Calculus with Mathematica (C&M) for over a decade.
There is now quite a range of math courses taught with the C&M approach on our campus. This
approach fully embodies the "guide on the side" model of instruction. Students explore on their
own through carefully crafted exercises in Mathematica. There is no lecture and pairs of students
typically do homework in tandem. Once a week, after the exploration and right before the
homework is due, there is a casual, instructor-led discussion to review what has been learned
from the students' explorations and to emphasize the main points of the exercises. During the
week, the instructor and undergraduate TAs are available for consultation.
Over the past few years, C&M has expanded its operation to include a distance component,
NetMath. This was targeted primarily at high school students in rural areas who would not have
access to an AP Math course. There are now also adult learners and students from urban high
schools. The distance students work the same exercises as the on-campus students. They have
access to on-line help through the telephone and more recently through Netscape Conference,
application sharing using Timbuktu, and asynchronous communication via e-mail.
Undergraduates staff this on-line help.
There are two components to the current project. First, these distance-learning help mechanisms
have been placed within the on-campus MATH 285 course, turning the on-campus course into a
true ALN course. This makes the course more convenient for the students (e.g., they would no
longer need the lab for completing their work) while allowing it to accommodate more students.
Enrollment in a C&M MATH 285 section has been artificially constrained by the size of the lab.
Second, self-pacing will be introduced and students will have a longer time to complete the
course. This too will be a convenience for the students, particularly the computer science
students who constitute the bulk of the audience for this class. The self-pacing should boost the
demand for the course substantially. The increased demand will necessitate more graders and online help. But the incremental cost is low and the cost per student should drop. The burden on
the instructor is only the writing and evaluating of the exams. This can be done on a contractual
basis, in an overload capacity.
In the fall 1997 semester, when this approach was being tried out for the first time, the instructor
got course credit for teaching the course and enrollments were only half that of the other C&M
sections. As a consequence, cost per student rose. In the spring 1998 semester, however, the
course was offered with higher enrollments and the instructor was assigned to the course in an
overload capacity. This lowered the cost per student below that of all the other formats in which
MATH 285 is taught.
We did a survey for both the ALN section and a traditional section late in the fall 1997 semester.
The survey had three parts. The first part had common questions. The second part had questions
that were intended only for the traditional section. The third part had questions that were
intended only for the ALN section. Here, we discuss only the first part and with that only where
there were significant differences across sections or where the students made summative
judgements. As it turns out, there were significant differences on many of these questions.
Students reported greater access to materials in the ALN section. They also reported much
greater interaction with their peers and consequently more learning from their peers. In
particular, 75% of them reported working with their peers several times a week to complete their
assignments, while half the traditional students reported no peer-to-peer interaction of this sort.
The ALN students found the instructor less accessible, by a wide margin.
On the summative questions, there was no significant difference reported in the amount of
learning across sections. The ALN students did report their course was difficult; the non-ALN
students reported less difficulty with the material. Nonetheless, the ALN students were more
likely to recommend the course to a friend.
Our reading of these results is that the ALN course is more intensive than the non-ALN course
and that the students by and large like that intensity. But they supply much of it on their own
with the help of their peers. The instructor was keenly involved in the course design but is less
involved interacting with the students.
Section Analysis
If instructor productivity is to increase and homework assignments are not short answer, labor-for-labor substitution must be part of the explanation. Both the ECON 300 PI and the MATH 285
PI are staunch advocates of using undergraduates for administering many course functions. The
fact that these students have already taken the course means there is little additional training
required. Moreover, that the instructor handpicks these students implies everyone is a willing
participant in the teaching. That is not always the case with graduate assistants. Some added
benefits are that the instructor naturally develops a mentoring relationship with these
undergraduate peer tutors (such mentoring relationships are rare on the UIUC campus) and that these
students, because they hold positions of responsibility, develop on-line communication skills that
have value in the job market. At least for these courses and these instructors, the competency of
the peer tutors’ assessment was not an issue.
D. Cell A of the Efficiency Matrix – BIO 122, CHEM 101 & 102, ECON 102
BIO 122
The BIO 122 course was one of the original SCALE projects. There have been two years of prior
development, mostly producing animations of microbial interactions for the Web and quiz
content for self-assessment authored in CyberProf. The current project is an outgrowth of two
related factors. First, the School of Life Sciences is being split into two separate schools − more
or less on the lines of microbiology in one school and macrobiology in the other school. Second,
there has been a growing reform movement in the teaching of biology. One precept of the reform
movement is to alternate the sequence in which material is presented in the biology curriculum.
Traditionally, labs have been used both to perform experiments and to teach the students how to
analyze the data that accrues from them. Under the reform view, teaching data analysis should
occur prior to performing experiments. The idea behind this project is that data analysis need not
be taught in a lab, but rather on-line in a "virtual lab." The conversion from wet labs to virtual
labs has the potential to produce substantial savings, not just in BIO 122 but elsewhere as well.
Moreover, there are other large introductory courses in the sciences that have no lab component
whatsoever, because it would be too costly to provide the wet-lab experience. Successful use of
virtual labs in a high enrollment course in one discipline could very well pave the way toward use
of virtual labs in introductory courses in other disciplines that could benefit from this type of
enrichment.
BIO 122 is a very high enrollment course, with about 1500 students a year, the vast majority of
them pre-med students. The PI co-teaches the course with another professor. She lectures
for the first half of the course and he lectures for the second half. There are weekly labs. The
virtual labs are quite intensive to author. For the implementation in fall 1997, there were about an
equal number of virtual labs and wet labs over the part of the course that the PI taught. More of
these virtual labs are being designed at present.
The potential for productivity increase is quite large. The labs are extremely labor-intensive. The
introduction of the virtual labs economizes on the number of lab assistants. There is also an
obvious economy in lab materials. By design, some of the economy in personnel is captured by
the graduate assistants. During the weeks with the virtual labs, they get a break from their
assistantship work and can spend that time on their thesis or pre-thesis research. This suggests
that one long-term measure of the presence of the efficiency gains is the time to dissertation
completion. Since graduate students in microbiology do a lot of teaching, if the virtual lab
productivity increases are real, the time to dissertation completion in microbiology should go
down.
The survey information we have for BIO 122 concerns the course in general, rather than the labs
in particular. The three summative questions in the survey were
1. How easy did you find using the Web for purposes of this course?
2. How would you rate your overall experience using the Web in this course?
3. Would you take another course using the Web?
Over 90% of the students reported that using the Web for the course was somewhat easy or easy.
Over 85% of the students reported that their experience with the course was good or
better. And over 85% of the students reported that they either probably or definitely would take
another course using the Web.
The ALN component of the course is quite popular, as the scores indicate. The results here look
similar to the results from STAT 100 and ECE 110.
CHEM 101 & 102
This project entails the introductory general chemistry sequence, CHEM 101 and 102. The
enrollment for these courses is very large, over 2000 students a year. CyberProf has been
adopted as a means for doing on-line homework and as a communication device. While lectures
remain the same, weekly quizzes are no longer administered during recitation sections. Instead,
the full period is devoted to answering student questions about the homework and to working
other problems. Since all the homework is now automatically graded, the TA burden has been
substantially reduced. Thus, TAs are now assigned three recitation sections (historically the load
has been two sections). More TA help was offered outside the recitation section, both in face-to-face hours and on-line. In spite of this increased out-of-class help, there was a substantial saving
in personnel costs from having some of the TAs take on three sections instead of the usual load of
two sections.
In fall 1997, for the first time, Chemistry ran CyberProf off its own servers. There were many
problems with server configuration, and they were not fully resolved until after mid-semester.
In essence, they had to use a makeshift system for the first half of the semester. The students’
responses to the summative questions should be interpreted in that light.
The three summative questions in this survey are:
1. How easy did you find using CyberProf for purposes of this course?
2. How would you rate your overall experience using CyberProf for this
course?
3. Would you take another course using CyberProf?
Seventy percent of the respondents reported that CyberProf was somewhat easy or easy to use.
Fifty-six percent of the students rated their experience good or better, and 56% would probably
or definitely take another course using CyberProf. Given the server problems, these marks are certainly
encouraging.
ECON 102
ECON 102 is one of the highest enrollment classes on campus. Enrollments per year are roughly
3500. The PI has taught ECON 102 lectures with as many as 1400 students. However, due to
recent campus restrictions on maximum class size, the enrollment in an ECON 102 lecture has
been capped at 750 students. The PI's class meets twice a week as a whole and once a week in
recitation sections led by TAs. Last spring, he used Mallard for on-line quizzes and a Web
conferencing package written by his former undergraduate programmer. Rather than author the
Mallard questions himself, the PI secured a database of questions from the publisher of the
textbook he uses and had his assistant put the material in Mallard.
The PI authored on-line material (presentation with voice annotation) that would be used by the
students in lieu of attending the recitation section. Ultimately, the voice component was dropped
in this round because of difficulties in the logistics of implementation. The TAs who had taught
these recitation sections were able to offer more help out of class, both face to face and on-line.
The PI estimates that he can double the student-to-TA ratio this way and thus require fewer TAs
in total. This is consistent with the Economics Department goal to downsize the Ph.D. program.
As an added benefit, the reduction in the number of TAs will eliminate some of the variation in
individual TA presentations across sections.
In spring 1998, the PI used ALN in a few experimental sections, entailing about 200 students. If
this proves successful, he plans to use it in all sections of ECON 102 the next time he is slated to
teach it, spring 1999.
ECON 102 is also one of the original SCALE projects. In each of the three times it has been
offered, there have been ALN and non-ALN sections, with common exams. Kristine Burnaska is
doing a separate, intensive study, [27], that looks at the various determinants of student
performance in ECON 102 and, in particular, compares the performance of students in the ALN
sections to those in the non-ALN sections. Because the ALN version of ECON 102 is taught only
in the spring, we do not have survey or interview results as part of this SCALE evaluation.
Section Analysis
Both BIO 122 and the introductory Chemistry sequence rely on CyberProf a great deal. Yet the
students’ responses to the summative questions on the survey for BIO 122 were extremely
positive, much more so than in the introductory chemistry courses. This lends credence to the
idea that some of the negatives in the Chemistry survey were due to server problems rather than
the teaching itself.
In all the courses discussed in this section, teaching assistant productivity is increasing via ALN,
either because grading has become automated or because presentation has been moved online.
That Microbiology allows much of the productivity increase to be captured by the students while
both Chemistry and Economics are using the productivity to reduce costs from their end raises an
interesting further point. Who captures the gain from the productivity increase (the instructor,
the department where the course is housed, or the students in the course) seems to be a matter of
choice rather than something intrinsic to the teaching approach. As it becomes increasingly believed
that such productivity gains are feasible, more attention should be paid to the issue of capturing
the gains.
VI. GENERAL FINDINGS
Four of the courses (SPAN 210, ECON 300, STAT 100, and ECE 110) used Mallard for at least
some of the on-line assignments. Two others, the introductory chemistry sequence and BIO 122,
used CyberProf in a similar manner. CHEM 331 used WebCT, a commercially available
product that also allows for automated grading of quizzes. There seem to be several benefits
from use of this type of software in a large ALN course:
1. Students liked being able to take the quizzes when it was convenient for them rather than at a
pre-specified time.
2. Students liked being able to retake quizzes immediately when they got something wrong.
They could learn from their own mistakes. Since the course credit they received was based
on successful completion of the quizzes only, not the number of tries it took to reach success,
this type of learning was encouraged.
3. Students liked that the quizzes had deadlines. The deadlines imposed a certain discipline on
them, forcing them to keep up with the course material.
There were also some complaints, some about the software itself and some about how instructors
authored material for it:
1. Instructors would often put in hints to be accessed if students had difficulty with the
questions. In some cases students reported the hints were vague and not helpful. In other
cases no hints were available.
2. Students reported that they could get the right answer to many questions by guessing rather
than by reasoning through the problem.
3. In ECE 110 and in the introductory chemistry courses, there were numerous complaints about
delays in getting back a response from the server after submitting an answer. These complaints
didn’t come up in the other courses.
The first complaint might be dismissed if students were in fact seeking help from classmates,
TAs, or the instructor when they were stuck on a question, especially if the instructor had
deliberately designed the hints to encourage such dialog. It seems clear, however, that at least
some students have an expectation that they should be able to do these on-line quizzes without
seeking outside help.
The second complaint points to a broader question. Are these on-line quizzes educative or make-work? If the quizzes are make-work, is this a fundamental problem with this type of courseware?
Or is the problem with the way the material has been authored – not being sufficiently attentive to
how the students respond when taking the quizzes? If the latter, we are probably understating our
development costs in that there needs to be even further rewriting of the on-line material.
To the extent that the delays in response could be resolved if there were more servers, the third
complaint suggests that we may again be underestimating our fixed costs of ALN instruction. If
these delays are inherent in the courseware design, when used in large classes where the question
type is such as to impose a heavy computation burden on the server, there is another issue. Could
other courseware be written with similar functionality and yet not produce these delays?
All the courses used computer conferencing to some degree. The students’ like or dislike of
conferencing seemed to relate directly to how much feedback they got from the instructor and/or
the TAs. In some cases there was sufficient feedback that class size didn’t seem an issue. In
other cases feedback was scarce and there were student complaints about the course being too
impersonal. Interestingly, it seems that human feedback per se is the concern. Students did not
seem to insist that the feedback come from the professor.
Here are some of the faculty responses to the question: What is your philosophy on limiting or
not limiting the number of times students can take the ALN quizzes/homeworks? These
responses seemingly parallel those of the students on the pedagogic benefit but raise other
concerns, namely cheating and grade inflation.
• There is the question of whether or not this is inflating the grades. The other question is how
do you motivate people. These quizzes in a sense force motivation, force the students to do
the work. Unless they found someone who would do it for them. But then they had problems
taking the test.
I’m a little worried that we are not doing enough to discourage cheating. The scores are not
on a curve so one good buddy has no incentive not to help his other good buddy or to take
money from him to get a substantial part of the grade in the course for his friend.
• I let them take them over and over. I think the more subjected they are to the material the
better. I’ve also noticed that students are contacting me more about problems that they miss
now that they get immediate feedback. Before when students would submit homework and get
it back a week later, they didn’t really care any more what the right answer was. Now I get
e-mails from students who say they are trying the problem right now and getting the wrong
answer, they ask what they are doing wrong. I think this is good because they are thinking
about the questions more and why they are getting them wrong. This is a direct result of the
immediate feedback from using Mallard as opposed to the delayed feedback of having
traditional homework.
Also, they are getting subjected to the material more. It doesn’t matter if they are trying it
over and over again. The point is that they are looking at it again and again. Doing the
work again and again.
• My overwhelming motivation to allow them to take them over and over again is that I am
subjecting them to the material. If they get it wrong and they look at it again, then they have
had more contact than if they simply turned it in and found out later they had missed the
question.
Grade inflation is not an impossibility. But I think the more correct answer is that they have
been subjected more to the material. On the subject of: is it less fair that my students receive
a higher grade partially due to the fact they have a limited amount of points that are almost
guaranteed than students in other non-ALN sections - I think academic freedom, faculty
members’ ability to decide how to structure their course, is a common standard. My students
are subjected more to the material. It’s not that it is unfair.
VII. CONCLUSION
We have tried to present the evidence in a balanced light. Certainly, we wish there were more
comparative evidence to present, whether it entailed student performance or student attitudes, as
we feel more confident weighing in on the “holding quality constant” issue with that type of
evidence.
The performance evidence that we do have suggests the students in the large ALN classes are not
harmed and may even benefit from the ALN approach, relative to their peers in a traditionally
taught and traditionally sized class.
The comparative attitudinal evidence strongly suggests that what students do in an ALN course is
different from what they do in a traditional course. It is by no means a trivial matter to compare
course quality when the student activities change. In particular, in both the Spanish Grammar
course and the Differential Equations course, the ALN sections entailed a lot more self-study and
a lot less interaction with the instructor. Is that good or bad? Does that depend on the students
taking the course? The evidence suggests that it is good for at least some students. Where we do
not have comparative evidence, the responses to the summative questions suggest that most
students like ALN when taught in a large class setting.
Furthermore, the above conclusions seem to hold for courses in each of the cells of the Efficiency
Matrix. This suggests that there are many possible courses that could be targets of like-minded
projects and that there is an array of alternative approaches that can be utilized to achieve the
type of results we have found with the SCALE projects.
We are buoyed by these results. While they do not provide absolute confirmation, they support our view
that when a sensible pedagogic approach is embraced that affords the students with avenues to
communicate about their learning, ALN can produce real efficiency gains in courses without
sacrificing the quality of instruction.
REFERENCES
1. http://cyber.ccsr.uiuc.edu/CyberProf/
2. http://www.cen.uiuc.edu/Mallard/
3. Ory, J. C., Bullock, C. D., and Burnaska, K. K., “SCALE Fall 95 Evaluation Results,”
http://w3.scale.uiuc.edu/scale/evaluations/fall95/index.html, 1996.
4. Ory, J. C., Bullock, C. D., and Burnaska, K. K., “SCALE Spring 96 Evaluation Results,”
http://w3.scale.uiuc.edu/scale/evaluations/spring96/index.html, 1996.
5. Ory, J. C., Bullock, C. D., and Burnaska, K. K., “SCALE Fall 96 Evaluation Results,”
http://w3.scale.uiuc.edu/scale/evaluations/fall96/index.html, 1997.
6. Ory, J. C., Bullock, C. D., and Burnaska, K. K., “SCALE Spring 97 Evaluation Results,”
http://w3.scale.uiuc.edu/scale/evaluations/spring97/index.html, 1997.
7. Wilson, J. M., “The CUPLE Physics Studio,” The Physics Teacher, 1994.
8. Kashy, E., Thoennessen, M., Tsai, Y., Davis, N. E., and Wolfe, S. L., “Using Networked Tools to
Enhance Student Success Rates in Large Classes,” Proceedings of the Frontiers in Education,
http://fairway.ecn.purdue.edu/~fie/fie97/papers/1046.pdf, 1997.
9. Kashy, E., Thoennessen, M., Tsai, Y., Davis, N. E., and Wolfe, S. L., “Application of Technology
and Asynchronous Learning Networks in Large Lecture Classes,” 31st Hawaii International
Conference on System Sciences, Volume I, Collaboration Systems and Technology Track, page 321,
edited by Jay F. Nunamaker, Jr., 1997.
10. McClure, P. A., “‘Growing’ Our Academic Productivity,” in Reengineering Teaching and Learning in
Higher Education: Sheltered Groves, Camelot, Windmills, and Malls, edited by Robert C. Heterick, Jr.,
http://cause-www.colorado.edu/information-resources/ir-library/abstracts/pub3010.html, 1993.
11. Noble, D. F., “Digital Diploma Mills: The Automation of Higher Education,” First Monday,
http://www.firstmonday.dk/issues/issue3_1/noble/, 1998.
12. Green, K., “The Campus Computing Project,” http://ericir.syr.edu/Projects/Campus_computing/,
1997.
13. Koob, R., “New Funding Paradigms: The Need for New Incentives,” NLII Viewpoint, Vol. 1, Issue 1,
http://www.educom.edu/program/nlii/articles/koob.html, 1996.
14. Twigg, C., “Academic Productivity: The Case for Instructional Software,”
http://www.educom.edu/web/pubs/pubHomeFrame.html, 1996.
15. Twigg, C., “The Need for a National Learning Infrastructure,”
http://www.educom.edu/program/nlii/keydocs/monograph.html, 1994.
16. Johnstone, D. B., “Learning Productivity: A New Imperative for American Higher Education,” NLII
Viewpoint, Vol. 1, Issue 1, http://www.educom.edu/program/nlii/articles/johnstone.html, 1996.
17. Massy, W. F., and Zemsky, R., “Using Information Technology to Enhance Academic Productivity,”
http://www.educom.edu/program/nlii/keydocs/massy.html, 1995.
18. http://www.tltgroup.org/
19. Arvan, L., “The Economics of ALN: Some Issues,” Journal of Asynchronous Learning Networks,
Volume 1, Number 1, http://www.aln.org/alnweb/journal/issue1/arvan.htm, 1997.
20. http://www.cba.uiuc.edu/~l-arvan/SCALEevalf97/
21. Arvan, L., “Bottom Up or Top Down, Using ALN to Attain Efficiencies in Instruction,” Proceedings
of the Third International Conference on Asynchronous Learning Networks,
http://www.aln.org/conf97/slide/arvan/arvan/index.htm, 1997.
22. Berube, M., “Why Inefficiency is Good for Universities,” Chronicle of Higher Education, Vol. 44,
Number 29, 1998.
23. Young, J. R., “Technology May Not Be All That Great, Say Professors at 'Second Look' Meeting,”
Chronicle of Higher Education, Vol. 44, Number 34, 1998.
24. http://www.cba.uiuc.edu/~l-arvan/SCALEevalf97/Part1-JALN-Efficiency.doc
25. http://www.pathfinder.com/money/colleges98/collegecat98.html
26. http://www.csun.edu/sociology/virexp.htm
27. Burnaska, K., “How the Implementation of Technology into Higher Education Affects Students'
Motivation, Resource Management Skills, Course Achievement Scores and Class Preparation,”
Unpublished Dissertation, 1998.