Joe Cuseo - FYE Task Force 2012

Early-Alert (Early-Warning) Programs:
Definition, Advantages, Variations & Illustrations
Joe Cuseo
jcuseo@earthlink.net
What defines an early-alert program?
An early-alert system may be defined as a formal, proactive feedback system through which
students and student-support agents are alerted to early manifestations of poor academic
performance (e.g., a low course grade at or before midterm) or academic disengagement (e.g.,
high rates of absenteeism). It is unclear whether non-classroom-based indicators of students at risk for
attrition are being routinely used as part of early-alert systems (e.g., little or no contact with
academic advisors, failure to register for following-term classes, failure to renew work-study,
financial-aid, or campus housing agreements, or requesting transcripts before eligibility to graduate).
What is driving the growing interest in early-alert/early-warning programs?
Two key developments appear to account for why early-alert programs are proliferating:
(1) The rapid growth of technology-mediated commercial systems designed to facilitate the early-alert process. These systems reduce the need for time-consuming, labor-intensive clerical work
and allow for delivery of immediate (“real time”) progress reports to students and student-support
professionals.
(2) The increasing number of academically underprepared and first-generation students entering
higher education; these students may not be ready to meet the academic expectations of higher
education or may lack the social capital (college knowledge) to succeed without close monitoring
and early support. Australian scholars, McInnis, James, and McNaught (1995), artfully articulate
the ethical responsibility of postsecondary institutions to provide support for these students:
There are first year students who do not understand the difference between school and
university, or who are so lacking in fundamental skills that they are not ready to take
responsibility for their learning. Admitting these students without providing adequate support
services and then criticizing them for failing to match up to expectations would be clearly a
case of blaming the victim (p. 8).
Are early-alert programs effective?
Empirical support for the effectiveness of early-alert programs is slim and consists primarily
of single-institution studies involving small sample sizes and use of methodologies that are not
particularly rigorous. See Appendix A, p. 8, for a sample summary of campus-specific studies.
An early-alert system should build a pilot study into its initial program-development plan that
includes a plan to collect data on faculty fidelity to program implementation and student
responsiveness to alert messages. The plan should also include a strategy for assessing the
efficacy of different intervention strategies triggered by the system (e.g., their impact on students’
course or program persistence and performance). Failure to do so relegates the early-alert
system to serving merely as an early-identification-and-referral procedure, rather than a bona fide
student intervention-and-success-promoting system.
Although early-alert systems still lack a strong base of outcomes-based evidence, the early-alert process does implement a number of theoretically sound principles of program delivery, namely:
a) Proactive delivery: early-alert programs deliver early feedback and take preventative action to
short-circuit student difficulties in an anticipatory fashion—before they require reactive (after-the-fact) intervention or eventuate in student attrition. An early-alert system that ensures faculty
provide their students with some evaluation and feedback during the first four weeks of class is
not only an effective retention practice for at-risk students, it’s also an effective learning strategy
that benefits all students.
b) Intrusive delivery: early-alert programs initiate supportive action by reaching out to students
and bringing support to them—as opposed to “passive programming” that waits for students to seek
out support on their own. Research indicates that student use of campus support services is
woefully low, particularly by students from families without a college-going tradition. Early alert
represents a process of intrusive, course-integrated student support that has the potential to
reach a larger number of students than passive, stand-alone support programming.
c) Targeted delivery: early-alert programs focus support on those students who need it the most,
i.e., students whose behavior indicates they are at risk for academic failure or course attrition.
This principle of targeted delivery is particularly important during times when budgets are tight
and resources are limited.
d) Personalized delivery: students’ motivation to succeed increases when they perceive they are
being noticed as individuals and that their personal success matters to the institution. Early alert
is an individualized form of student support that promotes student persistence by providing
personal attention and validation.
Advantages of Early Alert Relative to Predictive Modeling
Early-alert programs also have advantages over other approaches that attempt to predict at-risk students solely on the basis of their demographic characteristics or by measures of their pre-college academic performance. Early-alert indicators are measures of actual college behavior that
have been exhibited, observed, and documented; they are not attempts to infer or predict student
behavior on the basis of group affiliation or academic history in non-college settings. Group-based approaches to identifying and supporting at-risk students also run the risk of “stereotype
threat”—a form of negative self-fulfilling prophecy that may be experienced by individual
members of a group that has been labeled “at risk”, which can result in a loss of self-confidence
or self-efficacy due to heightened awareness of their greater risk for failure (Steele, 1997).
Instruments designed to predict at-risk students at college entry are based on students’ self-reported responses. Although answering “yes” to a survey question about “intent to leave”
before graduation is generally a good predictor of student attrition, early-alert behavior goes
beyond prediction based on self-report to prediction based on observable (objective) behavior
indicating that the student is truly acting on intent—moving it from probability to actuality. Simply
stated, the best predictor of a student who is at risk for attrition is student initiation of behaviors
that will eventually lead to attrition. Thus, it follows that the best way to prevent attrition is to
intercept attrition-initiated behavior before it eventuates in actual withdrawal.
An early-alert program can be used to identify at-risk behavior manifested in different campus
contexts, such as poor performance in the classroom, disengagement outside the classroom, or
behavior indicating intent to discontinue enrollment (e.g., failure to register for next-term
classes, failure to renew financial aid or student housing, or requesting copies of transcripts
before eligibility to graduate). If early alert is defined broadly to include in-class and out-of-class
indicators of potential withdrawal, the program acquires the potential to involve multiple campus
offices and multiple members of the college community in the retention process. That “retention is
everybody’s business” and that successful retention requires a “total institutional response” have
become truisms in the retention field. Postsecondary research supports these truisms by
demonstrating that campus programs aimed at increasing student retention are more effective
when Academic and Student Affairs collaborate to design and deliver these programs (Stodt &
Klepper, 1987). For example, in a national research project designed to document effective
educational practices (Project DEEP), it was discovered that a high degree of respect and
collaboration between Academic and Student Affairs typifies institutions with higher-than-predicted graduation rates (Kuh, et al., 2005). Similar results were obtained from an in-depth
study of state universities with higher-than-average graduation rates, which revealed that one
distinctive feature of high-performing institutions was campus-wide coordination of retention
efforts that stimulated communication and cooperation between Academic and Student Affairs
(AASCU, 2005).
Furthermore, student-support agents are likely to feel more comfortable intervening with a
student who has exhibited concrete actions or specific behaviors that can be referred to and used
as focal points for discussion and modification. Intervention can be awkward when a support
agent is working with a student who has the potential to be at risk (e.g., based on pre-college
performance or by being a member of an at-risk group) but has yet to demonstrate any behavior
indicating they actually are at risk. This may be comparable to presuming a student to be guilty
(of engaging in at-risk behavior) solely on the basis of personal characteristics or prior history—
without having any current evidence that the student is doing anything wrong (risky).
The point here is not to pooh-pooh the value of using demographic data and at-risk prediction
instruments at college entry. Such prognostic information can be very useful; however, this
information needs to be augmented or corroborated by individual diagnostic data and
personalized intervention strategies.
Types or Varieties of Early-Alert Programs
Early-alert programs may be classified into three major categories: (a) midterm-grade reports,
(b) scores on at-risk prediction instruments, and (c) pre-midterm behavioral warning systems.
Midterm-Grade Reports
Issuing grades at midterm probably represents the first and most commonly used practice for
alerting students proactively about poor academic progress. One national survey revealed that
more than 60% of postsecondary institutions report midterm grades to first-year students for the
purpose of providing them with early feedback on their academic performance. Approximately
10% of these institutions obtain right-to-privacy waivers that allow reporting of midterm grades to
both first-year students and their parents (Barefoot, 2001). Students with dangerously low
midterm grade reports are typically notified by letter to speak with an institutional representative
(e.g., academic advisor or academic dean) who, in turn, refers the notified student to the
appropriate support service. At some institutions, such as New York University, academic
advisors make follow-up phone calls to students who fail to respond to their initial letter of
notification (Early Intervention Programs, 1992). At Brooklyn College (NY), faculty notify peer
tutors when students are having academic difficulties and the tutors initiate contact with the
student (Levitz, 1991).
Although use of midterm grades as an early-feedback system has a long history of use in
higher education, it also has a long history of implementation limitations and obstacles. These
limitations and obstacles are described below, along with potential solution strategies for
ameliorating them.
1. Lack of faculty compliance—i.e., faculty unwillingness to calculate and formally report
midterm grades for all students in all their courses.
A national survey of early-alert programs revealed that lack of faculty buy-in, input, and timely
delivery of early-alert messages was a primary factor in limiting program effectiveness (Lee
Greenhouse Associates, 2008). Faculty resistance to computing and submitting midterm grades
may be minimized if instructors are not asked to submit midterm grades for all students, but only
for those students who are in academic jeopardy (e.g., students whose grades are C- or below).
Students’ midterm grades for one course in particular—the first-year seminar (a.k.a., first-year
experience course)—may have the potential to serve as a vehicle for early identification of first-term students who may be at risk for academic failure and attrition. (See Appendix B, p. 10, for
further details and supporting evidence.)
Faculty compliance rates may also be increased by increasing the convenience of the grade-reporting procedure (e.g., easy-to-complete grade forms or online grade submission). Lastly,
instructors may be expected to show higher rates of compliance if they are recognized or
rewarded for doing so; for instance, if department chairs and academic deans “count” their record of
compliance in promotion-and-tenure decisions.
2. Midterm grade reports are not sufficiently proactive, i.e., they often are not received and
acted upon in time to make a significant improvement in the student’s course grade.
Issuing midterm-grade reports to struggling students is a laudable practice, but as Tinto
(1993) warns, by the time midterm grades are recorded and disseminated, feedback may come
too late in the term to be useful for improving course performance. However, midterm grades
may still be a useful way to alert the student (and advisors) of the need to withdraw from a course
in which a midterm grade is extremely low. This advantage of midterm grade reports may be
maximized if instructors are asked to report not only the student’s grade, but also what
percentage of the student’s final grade it represents.
3. Reporting a grade at midterm does not specify the source (cause) of the poor
performance, fails to suggest what specific action the student should take, and does not
suggest what particular intervention strategy a student-support agent should take to
rectify the problem.
Effective performance-enhancing feedback should be: (a) proactive, delivered early in the
learning process to allow for early error detection and correction, and (b) precise, specifying clearly
what needs to be done to rectify errors and improve subsequent performance. Midterm grades,
although useful for informing decisions about whether or not a student should persist in or withdraw
from a course, represent feedback that is typically neither proactive nor precise enough to be
used by students to improve their course performance and final course grade.
Pre-Midterm Behavior Warning Systems
Growing awareness of limitations of midterm grades for providing feedback early enough to
improve course performance has led to more interest in the use of pre-midterm, early-alert
systems. Identifying and connecting with students who exhibit disengagement very early in the
term—before midterm grades are calculated, processed, and disseminated—represents a more
proactive alert system. Campuses are relying more on earlier feedback mechanisms, based on
student behavior during the first 2-6 weeks of the term (e.g., students who miss class regularly,
who are chronically tardy, who consistently fail to turn in their assignments, or who rarely are
prepared for planned class activities). For instance, at New Mexico State University, attendance-problem requests are sent to instructors during the second week and sixth week of the term.
Students demonstrating attendance irregularities falling into any of the following categories
receive a phone call from the Office of Advisement Services: (a) first-semester students, (b)
students on academic probation, and (c) students with multiple early-alert reports (Thompson,
2001).
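A triage rule like the one attributed to New Mexico State above can be sketched as a simple filter. This is an illustrative sketch only: the student records and names below are invented, and only the three priority categories are taken from the description above.

```python
# Hypothetical sketch of the triage rule described above: among students with
# attendance irregularities, a follow-up call goes to (a) first-semester
# students, (b) students on academic probation, and (c) students with
# multiple early-alert reports.
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    first_semester: bool
    on_probation: bool
    alert_count: int  # number of early-alert reports submitted this term

def needs_follow_up_call(s: Student) -> bool:
    """Return True if the student falls into any priority category."""
    return s.first_semester or s.on_probation or s.alert_count > 1

# Invented roster for illustration:
roster = [
    Student("A. Lee", first_semester=True, on_probation=False, alert_count=1),
    Student("B. Cruz", first_semester=False, on_probation=False, alert_count=1),
    Student("C. Diaz", first_semester=False, on_probation=False, alert_count=3),
]

to_call = [s.name for s in roster if needs_follow_up_call(s)]
```

In this invented roster, only the first and third students meet a priority category, so only they would be queued for a phone call.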
Other colleges and universities issue early-alert forms that request additional information from
the instructor that is used to help diagnose the specific nature of the problem and facilitate
intervention that is tailored or customized to its particular cause. To increase compliance with this
request, report forms are becoming increasingly “user friendly,” ensuring that completion of them
is neither time-consuming nor labor-intensive. For instance, at Adelphi University (NY), early-warning rosters are released during the fourth week of class and faculty report students who are
experiencing academic difficulty, using an efficient abbreviation code to identify the specific
area(s) of weak performance: AP = Assignment Performance, CP = Class Participation, EX =
Examination Performance, IA = Intermittent Attendance, NA = Never Attended, NC = Non-Completed assignments, and WE = Weak Expository skills (Carlson, 2000).
At Marymount College (CA), the offices of Academic Affairs and Student Development
Services collaborate to identify and intercept academic problems during the early weeks of the
term through a program titled, “R.E.T.A.I.N,” an acronym standing for: Re-Engagement Through
Academic Intervention Now. Easy-to-complete forms are placed in faculty mailboxes that may be
used to identify students exhibiting early behavioral signs of disengagement. Faculty are given
the option of sending these forms to the Assistant Academic Dean, or contacting the Dean by
electronic/voice mail to report students exhibiting early “red flag” behavior. Particular attention is
paid to students for whom more than one R.E.T.A.I.N form has been submitted.
At North Central State College (OH), the COCO information system is used to facilitate the
early-alert process. This computer system allows faculty to access their class rosters through a
website and faculty portal. If faculty wish to send an early alert to any student at any time during
the term (first week through the last), they simply check a box next to the student’s name on the
website roster. This takes the faculty member to another page where s/he checks the problem
(non-attendance, poor homework, poor tests, other), types in notes if needed, and sends it. The
electronic message goes to three places: (1) to the student’s e-mail, (2) to the college’s Student
Success Center, and (3) back to the faculty member who originally sent it. An advisor in the
Student Success Center then follows up with a phone call, email, or letter to the student to
discuss options. The system was initially intended for use only during the first half of the term;
however, faculty liked it so well, they asked for it to be available throughout the term (Walker,
2005).
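The three-way delivery described above can be sketched as a small routing function. This is an illustrative sketch, not the actual COCO system; the e-mail addresses and field names are placeholders invented for the example.

```python
# Illustrative sketch (not the actual COCO system): when an instructor flags a
# student, one early-alert report fans out to three destinations, mirroring the
# three-way delivery described above.

SUCCESS_CENTER = "success.center@example.edu"  # placeholder address

def route_alert(student_email, sender_email, problem, notes=""):
    """Build the three outgoing messages for one early-alert report."""
    message = {"problem": problem, "notes": notes}
    recipients = [
        student_email,   # (1) the student's e-mail
        SUCCESS_CENTER,  # (2) the college's Student Success Center
        sender_email,    # (3) confirmation copy back to the instructor
    ]
    return [{"to": r, **message} for r in recipients]

outbox = route_alert("student@example.edu", "prof@example.edu",
                     problem="non-attendance", notes="missed weeks 2-3")
```

Each call produces three identical messages addressed to the three parties, so the Student Success Center can follow up while the student and instructor each hold a copy of the same report.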
In recent years, there has been a rapid proliferation of technology-mediated early-alert products.
See Appendix C (p. 12) for sample descriptions of these commercial products, and see Appendix
D (p. 14) for criteria that might be used to evaluate early-alert software programs.
At-Risk Prediction Instruments
Instruments have been developed to identify at-risk students by assessing their self-reported
attitudes and behaviors at college entry (e.g., at orientation or during the first week of class).
These instruments may also be administered at some point after the onset of the academic term
(e.g., 2-4 weeks), thus allowing them to function as a quasi-early-alert program by using
indicators based on students’ self-reported attitudes and habits, rather than observations of
student behavior or performance. See Appendix E, p. 15, for sample descriptions of these
instruments.
References
AASCU (American Association of State Colleges & Universities) (2005). Student success in state
colleges and universities. Retrieved January 21, 2008, from
http://www.aascu.org/GRO/docs.htm
Anderson, C. & Gates, C. (2002). Freshmen absence-based intervention at The University of
Mississippi. Paper posted on the First-Year Assessment Listserv (FYA-List), August 8, 2002.
Retrieved from http://www.brevard.edu/fyc/remarks/andersonandgates.htm
Barefoot, B. O. (2001, July) “Summary of Curricular Findings.” Retrieved August 12, 2001, from
http://www.brevard.edu/fyc/survey/currentpractices/summaryoffindings.html].
Barefoot, B. O., Warnock, C. L., Dickinson, M. P., Richardson, S. E., & Roberts, M. R. (Eds.)
(1998). Exploring the evidence, Volume II: Reporting outcomes of first-year seminars.
(Monograph No. 29). Columbia, SC: National Resource Center for The First-Year Experience
and Students in Transition, University of South Carolina.
Budig, J., Koenig, A., & Weaver, T. (1991). Postcards for student success. Innovation Abstracts,
12 (28), 4.
Carlson, Linda [carlson@adelphi.edu]. “Early Alert Programs,” Message to fye-list [fye-list@vm.sc.edu]. Oct. 2, 2000.
“Early Intervention Programs Help Keep New Students on Course” (1992). Recruitment and
Retention in Higher Education, 6(3), p. 9.
Fidler, P. P., & Shanley, M. G. (1993, February). Evaluation results of University 101.
Presentation made at the annual conference of The Freshman Year Experience,
Columbia, South Carolina.
Geltner, P. (2001). The characteristics of early alert students. Santa Monica, CA: Santa
Monica College. (ERIC Document Reproduction Service No. ED463013).
Green, M. G. (Ed.) (1989). Minorities on campus: A handbook for enhancing diversity.
Washington, D.C.: American Council on Education.
Kuh, G. D., Kinzie, J., Schuh, J. H., Whitt, E. J., & Associates (2005). Student success in
college: Creating conditions that matter. San Francisco: Jossey-Bass.
Lee Greenhouse Associates. (2008). Study of the market for retention services—Findings report.
Cincinnati, OH: Author.
Levitz, R. (1991). Adding peer tutors to your retention program. Recruitment and Retention in
Higher Education, 5(10), pp. 5-7.
McInnis, C., James, R., & McNaught, C. (1995). First year on campus: Diversity in the initial
experience of Australian undergraduates. Melbourne: Centre for the Study of Higher
Education.
Steele, C. M. (1997). A threat in the air. How stereotypes shape intellectual identity and
performance. American Psychologist, 52(6), 613-629.
Stodt, M. M. & Klepper, W. M. (Eds.) (1987). Increasing retention: Academic and student affairs
administrators in partnership. New Directions for Higher Education. San Francisco: Jossey-Bass.
Thompson, K. (2001, April 4). “Early Warning Systems,” Message to fye-list (fye-list@vm.sc.edu). [kthompso@cavern.nmsu.edu]
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.).
Chicago: University of Chicago Press.
Walker, B. (2005, Nov. 16). “Excessive Absences From Classes,” Message to fye-list (fye-list@vm.sc.edu). [bwalker@ncstatecollege.edu]
Appendix A
Colleges and Universities Reporting Positive Outcomes
Associated with Early-Alert (Pre-Midterm) Programs
Virginia Commonwealth University
Every October, the university collects early-semester grades from instructors who teach 100 or
200-level courses and then notifies students receiving grades of D or F. Following this
notification, advisors contact the students by phone or e-mail to schedule an intervention session,
during which advisors address student issues and recommend strategies.
According to an assessment report issued in 2002, the Early Alert Intervention
Program: (a) decreased the number of students placed on academic warning the following term,
(b) increased the percentage of students receiving final course grades of C or better and (c)
increased the first-to-second semester retention rate (National Academic Advising Association,
1990-2010).
University of Mississippi
Faculty reported absences electronically for students in different sections of a first-year English
course. Students with multiple absences were identified for personal or telephonic intervention.
The results of this pilot study revealed that (a) there was a direct correlation between students’
class attendance and their academic success (GPA), and (b) students with multiple absences
who received intervention demonstrated greater academic success (higher GPA) than students
who did not receive intervention (Anderson & Gates, 2002).
University of Wisconsin–Oshkosh
After the third week of the semester, early-alert forms are sent to instructors teaching
preparatory and basic-skill courses populated by previously identified “high-risk” students. Forms
are sent to the Office of Academic Development Services, which initiates intrusive intervention by
contacting and meeting with each student to provide academic counseling, referral to a peer tutor
program, and suggestions for other forms of assistance. Since the program was initiated,
retention rates for at-risk students have risen steadily, reaching a level of more than 70 percent
(Green, 1989).
Vincennes University Junior College (Indiana)
When a student begins to miss class at this institution, course instructors tear off one part of a
computer-generated ticket; its keystroke input generates two postcards containing messages
of concern about non-attendance, one of which is addressed to the student’s local residence and
one to the student’s permanent address. Additional absences generate a second, more strongly
worded postcard indicating that the student is in danger of being dropped from the course. The
system also generates lists for academic advisors, alerting them of students majoring in their
academic field who have received attendance notifications.
Following institutional implementation of this early-alert system, the number of students
receiving grades of D, F, or W was substantially reduced. The beneficial effect of the early-alert
system was particularly pronounced in developmental mathematics classes, for which there was
a 17% drop in D and F grades and a concomitant 14% increase in A, B, and C grades (Budig,
Koenig, & Weaver, 1991).
Rudmann (1992) conducted a study whereby students referred to early-alert services by
professors were randomly placed into one of three groups: (1) those receiving an alert letter that
identified and described resources for improvement, (2) those receiving a letter requesting them
to meet with their advisor about the early alert and to discuss strategies, and (3) those on an early-alert list that received no contact (the control group). A comparison of the academic performance
of students in these three groups revealed that students who met with their advisor achieved
higher course grades than those who were simply notified about campus resources. Also,
students who received only a letter achieved higher final course grades than those who received
no contact whatsoever.
References
Anderson, C. & Gates, C. (2002). Freshmen absence-based intervention at The University of
Mississippi. Paper posted on the First-Year Assessment Listserv (FYA-List), August 8, 2002.
Retrieved from http://www.brevard.edu/fyc/remarks/andersonandgates.htm
Budig, J., Koenig, A., & Weaver, T. (1991). Postcards for student success. Innovation Abstracts,
12 (28), 4.
Green, M. G. (Ed.) (1989). Minorities on campus: A handbook for enhancing diversity.
Washington, D.C.: American Council on Education.
National Academic Advising Association (1990-2010). Early alert intervention program at Virginia
Commonwealth University. Retrieved December 12, 2010 from
www.nacada.ksu.edu/programs/Awards/archive/ars.htm
Rudmann, J. (1992). An evaluation of several early alert strategies for helping first semester
freshmen at the community college and a description of the newly developed early alert
software (EARS). Irvine, CA: Irvine Valley College. (ERIC Document Reproduction Service
No. ED 349 055)
Appendix B
First-Year Seminar Grade as an Early-Alert Indicator
Empirical support for the potential of first-year seminar course grades to serve as an early-alert
mechanism is provided by institutional research conducted on four consecutive cohorts of first-year students at the Massachusetts College of Liberal Arts. This campus-specific study
revealed that the first-year seminar grade can predict students’ overall first-year academic
performance better than high school grades or college-entry SAT/ACT scores (Hyers & Joslin,
1998). Similarly, at Floyd College—a public community college in Georgia, institutional research
indicates that a significant correlation exists between first-year seminar grade and subsequent
GPA (Green, 1996). Other campus-specific studies have shown that the specific grade earned by
students in the first-year seminar correlates significantly with student retention (Raymond &
Napoli, 1998; Starke, Harth, & Sirianni, 2001). These findings suggest that the course can serve
as an accurate diagnostic tool for identifying first-term students who may be academically at-risk
and in need of academic assistance or psychosocial intervention.
Such findings suggest that students’ academic performance in the first-year seminar may be
predictive of their general academic performance and persistence in their first year of college. If
this is the case, then campuses could target intervention procedures that are tailored specifically
for beginning students who perform poorly in the seminar, allowing the course to function as a
prognostic and diagnostic assessment tool for early identification and interception of academic
problems (and potential attrition) during the first year of college. The seminar could perform this
diagnostic function in a particularly proactive manner if the course concluded before the end of
the term, allowing students’ grades to be formally recorded and made accessible to advisors and
other student-support or intervention agents while students are still enrolled in other classes. This
strategy is used at The Ohio State University, Wooster Campus, where the seminar is offered
during the first five weeks of the semester. Institutional research on this campus demonstrates
that student grades in the course are better predictors of their success at the college than high
school rank or ACT score; and since these grades are known after the fifth week of the term,
early identification and intervention is possible (Zimmerman, 2000).
For seminars that do not conclude before the end of the term, course instructors could
generate midterm grades (or pre-midterm progress reports) for students experiencing academic
difficulty, which could be circulated to academic advisors or academic-support professionals.
First-term students receiving grades below a certain threshold or cutoff point in the seminar may
then be contacted for consultation and possible intervention. To determine this cutoff point,
assessment could be conducted on grade distributions in the seminar to identify the grade below
which a relationship begins to emerge between poor performance in the course and poor first-year academic performance or attrition. For instance, at the Massachusetts College of Liberal
Arts, it was found that students who earned a grade of C+ or lower in the seminar had a
significantly higher rate of first-year attrition (p<.001) than students who earned a grade of B- or
higher in the course (Hyers & Joslin, 1998).
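The cutoff analysis described above can be sketched as a small grade-distribution computation: for each seminar grade, compute the first-year attrition rate and look for the grade below which the rate jumps. The grade records below are invented for illustration; only the method is drawn from the text.

```python
# Illustrative sketch of the cutoff analysis described above: compute the
# first-year attrition rate for each seminar grade. An institution would then
# set the alert threshold at the grade below which attrition rises sharply.

GRADE_ORDER = ["A", "B+", "B", "B-", "C+", "C", "C-", "D", "F"]

def attrition_by_grade(records):
    """records: list of (seminar_grade, left_within_first_year: bool) pairs.
    Returns {grade: attrition rate} for grades that appear in the records."""
    rates = {}
    for grade in GRADE_ORDER:
        cohort = [left for g, left in records if g == grade]
        if cohort:
            rates[grade] = sum(cohort) / len(cohort)
    return rates

# Invented records for illustration:
records = [("B", False), ("B", False), ("B-", False), ("B-", True),
           ("C+", True), ("C+", True), ("C+", False), ("C", True)]
rates = attrition_by_grade(records)
```

In this invented data the attrition rate climbs steeply below B-, which is the kind of pattern that would justify a C+-or-below alert threshold like the one reported by Hyers and Joslin.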
As previously mentioned, use of midterm grades as an “early alert” or “early warning” system is
nothing new to higher education, but a perennial problem associated with its successful
implementation is lack of compliance—faculty teaching traditional introductory courses may have
neither the time for, nor the interest in, calculating and reporting midterm grades for all their
students. However, if the first-year seminar grade proves to be an accurate proxy for first-year
academic performance in general, then the midterm grade in this single course may serve as an
effective and efficient early-warning signal. Moreover, given that first-year seminar instructors
often self-select into the program because of their interest in, and concern for, promoting the
success of first-year students, they should display a high rate of compliance or reliability with
respect to submitting students’ midterm grades in an accurate and timely manner.
References
Green, J. T. (1996). Effectiveness of Floyd College Studies 101 on subsequent student success.
Unpublished doctoral dissertation, University of Alabama. In B. O. Barefoot, C. L. Warnock,
M. P. Dickinson, S. E. Richardson, & M. R. Roberts (Eds.), Exploring the evidence, volume II:
Reporting the outcomes of first-year seminars (pp. 9-10). National Resource Center for The
First-Year Experience and Students in Transition, Columbia, SC: The University of South
Carolina.
Hyers, A. D., & Joslin, M. N. (1998). The first-year seminar as a predictor of academic
achievement and persistence. Journal of The Freshman Year Experience & Students in
Transition, 10(1), 7-30.
Raymond, L., & Napoli, A. R. (1998). An examination of the impact of a freshman seminar course
on student academic outcomes. Journal of Applied Research in the Community College, 6, 27-34.
Starke, M. C., Harth, M., & Sirianni, F. (2001). Retention, bonding, and academic achievement:
Success of a first-year seminar. Journal of The First-Year Experience & Students in Transition,
13(2), 7-35.
Zimmerman, A. (2000). A journal-based orientation course as a predictor of student success at a
public two-year technical college. Journal of The First-Year Experience & Students in
Transition, 12(1), 29-44.
Appendix C
Sample Descriptions of Early-Alert Technology Products
* Note: The following list of early-alert software products is not meant to be exhaustive or even
comprehensive; it represents merely a small sample of an increasingly large population of
commercial systems. The products chosen for inclusion are not necessarily deemed superior to
those left unmentioned, nor does the order in which the products are listed reflect any ranking of
quality or effectiveness.
---------------------------------------------------------------------------------------------------------------------------------
MAP Works (Making Achievement Possible) (EBI)
This early-alert system was developed at Ball State University and later copyrighted by EBI
(Educational Benchmarking, Inc.). It consists primarily of a Transition Survey sent to first-year
students at about the third week of the semester that includes questions relating to two general
areas:
1) Academics
* Reading, writing, speaking, and study skills
* Computational and analytical skills
* Learning self-assessment
* Factors interfering with class attendance
* Time management
2) Student Development
* Health, wellness, and stress
* Encouragement and support
* Sense of belonging
* Social life (e.g., roommates, neighbors, campus life)
* Homesickness
* High school involvement
Results are benchmarked to allow individual students to compare their results with those of their
first-year cohort, and are reported in such a way that students readily see discrepancies between
their expectations and their behaviors.
---------------------------------------------------------------------------------------------------------------------------------
Signals - Stoplights for Student Success (Purdue University) (SunGard)
To increase the success of its students, Purdue University implemented an intervention
system called Signals, which was subsequently copyrighted by SunGard. Drawing on historical
correlations in students’ Blackboard behavior patterns, the system combines predictive
modeling with data mining of the Blackboard Vista system to identify students who are
academically at risk—starting as early as week two of the academic term and based on such
behaviors as low exam scores, missing or late assignments, and the frequency and recency of
students’ online use of course discussion posts and content pages. Risk ratings are assigned,
integrated with Blackboard, and made available on the student’s Blackboard homepage.
Using a predictive student-success algorithm, each student is assigned to a group based on
one of three stoplight ratings (red, yellow, or green), which can be released on students'
Blackboard homepage. Intervention emails composed by the instructor can be sent to
each risk group. The Signals system also communicates to students about faculty office hours
and support services on campus.
One noteworthy feature of Signals is that it is a course-specific system that allows individual
instructors to set their own at-risk indicators and communicate directly with students enrolled
in their courses.
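The stoplight logic described above can be sketched in miniature. Note that this is a hypothetical illustration only: the thresholds, weights, and indicator names below are assumptions for the sake of the example, not Purdue's or SunGard's actual predictive algorithm.

```python
# Hypothetical sketch of a stoplight-style risk rating, loosely modeled on the
# Signals approach described above. Thresholds and weights are illustrative
# assumptions, not the actual Signals algorithm.

def stoplight_rating(exam_avg, missing_assignments, days_since_last_login):
    """Assign a red/yellow/green risk rating from simple course indicators."""
    risk = 0
    if exam_avg < 70:                # low exam scores
        risk += 2
    if missing_assignments >= 2:     # missing or late assignments
        risk += 2
    if days_since_last_login > 7:    # low recency of online course activity
        risk += 1

    if risk >= 3:
        return "red"     # high risk: instructor intervention email
    if risk >= 1:
        return "yellow"  # moderate risk: nudge toward office hours/support
    return "green"       # on track
```

A course-specific system like Signals would let each instructor tune these cutoffs; the point of the sketch is simply that several weak behavioral signals are combined into one easily understood rating.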
---------------------------------------------------------------------------------------------------------------------------------
Starfish Early Warning and Student Tracking System
Key features:
* Interfaces with any course-management system used on campus (e.g., Blackboard, WebCT, or Angel)
* Allows input to be automatically submitted by instructors, advisors, and campus staff (e.g.,
residential staff and athletic coaches), thereby supporting a holistic approach to promoting
student success.
* Automatically raises flags for students based on academic behaviors (e.g., grades, assignment
tardiness and online course-related activity) and non-academic behaviors (e.g., changes in
appearance or disruptive out-of-class behavior)
* Threshold for raising flags is set by the campus
* Based on the type of flag raised, sends email notifications (without violating FERPA) to those
members of the campus community whom the college identifies as being in a position to help
the alerted student
* Allows comparative assessment of student outcomes relative to benchmark institutions.
---------------------------------------------------------------------------------------------------------------------------------
Hobsons’ Early Alert
System highlights:
* Employs an efficient communications module, allowing for easy delivery and storage of
automated e-mail messages to students, instructors and advisors, which contain information
about academic, social, behavioral and financial concerns and resources.
* Includes a strategic survey designed to assess and address student challenges, which allows
campuses to collect data and track students, and provides information to the campus about
areas in which student-support services need to be improved or added.
---------------------------------------------------------------------------------------------------------------------------------
FastTrack
Consists primarily of two surveys/inventories:
1) Partners in Education Inventory (PEI), administered at orientation or during the first class
session of the first-year seminar, on which students self-assess their strengths, weaknesses,
work status, and the programs or services that they think may help them.
2) Student Experiences Inventory (SEI), administered at midterm, which checks for changes in
students’ self-reported status and perceived needs for services that emerged during the
term.
Based on students’ responses to these surveys, emails are sent to appropriate offices (e.g., to
the Counseling Office if the student indicated that personal counseling might be helpful). Students
also receive an email that identifies the support service, contact information, and a website link.
Appendix D
Possible Criteria for Evaluating Early-Alert Software Programs
1. “Fit”
* Student Fit
Many software products contain features that are geared toward full-time, traditional-age
college students at residential campuses. How effectively do the system’s features accommodate
the issues and risk factors associated with your commuter students, non-traditional age students,
and students who are enrolled on a part-time basis? Also, is the technology used by the system
readily accessible to your students and likely to be used by them?
* Faculty Fit
Many software programs require faithful data entry by course instructors. Is the technology
familiar to, and likely to be used by, your faculty (full-time and part-time)?
* System Fit
Does the software interface smoothly with existing technological systems currently in place on
campus?
---------------------------------------------------------------------------------------------------------------------------------
2. Program Flexibility/Adaptability
* Can the program be easily modified or adapted to accommodate campus-specific questions and
issues? (For example, is the system conducive to convenient, customized inclusion of different
support strategies and services, depending on what student behavior(s) precipitated the early-alert notice?)
* Does the program allow early-alert messages to be personalized and customized to the specific
type of at-risk behavior exhibited by the student (e.g., nature of advice provided, particular
support person to be contacted)?
* Does the system allow multiple members of the college community to submit early-alert
messages for at-risk behaviors that are not classroom-based and for “non-academic” behaviors
that indicate intent to withdraw—either immediately or at the end of the term (e.g., not
registering, not renewing financial aid, or requesting college transcripts)?
* Does the program allow for clear control over the recipients of the alert message, balancing the
need to preserve students’ right to privacy with the capacity to convey early-alert messages to
any member of the college community who is well positioned to help the particular student who
is at risk?
---------------------------------------------------------------------------------------------------------------------------------
3. Student Tracking
* Is the system capable of following students longitudinally on a case-by-case basis, from initial
notification of the student, through the student’s response (if any) to the notification, to the nature
of subsequent action taken (e.g., support service utilized) and the outcome achieved (e.g.,
modification of at-risk behavior reported by the instructor)?
---------------------------------------------------------------------------------------------------------------------------------
4. Data Aggregation & Manipulation
* Can the data generated by the system be readily amalgamated and mined to (a) detect the
most common causes or recurrent reasons for early-alert notices, and (b) identify services to
which students are most frequently referred? The ability of the system to generate this
information may be invaluable for initiating programs or practices aimed at reducing the incidence
and frequency of future early-alert notices, thereby providing the campus with a potential weapon
for intercepting attrition more proactively and systemically.
---------------------------------------------------------------------------------------------------------------------------------
5. Capacity for Benchmarking and Comparative Analysis
* Does the system provide norms or reference points that allow for comparative interpretation,
analysis, and assessment of outcomes achieved by different interventions designed to address
students’ at-risk behaviors?
Appendix E
Instruments Designed to Identify “At-Risk” Students
The instruments listed below are designed primarily to identify at-risk students by assessing their
self-reported attitudes and behaviors at college entry (e.g., at orientation or during the first week
of class). These instruments may also be administered at some point after onset of the academic
term (e.g., 2-4 weeks), thus allowing them to function as an early-alert program that is based on
students’ self-reported attitudes and habits rather than on observations of their behavior or
performance.
---------------------------------------------------------------------------------------------------------------------------------
College Student Inventory (CSI) (Noel-Levitz)
Intended for administration during new-student orientation or the first week(s) of class, this
instrument asks students to respond on a 7-point Likert scale, ranging from “not at all true” to
“completely true,” to a series of statements that purportedly tap key cognitive and affective
indicators of potential attrition (e.g., “I am very strongly dedicated to finishing college—no matter
what obstacles get in my way.” “I plan to transfer to another school sometime before completing
my degree at this college or university.”) The inventory is available in two forms—short (30
minutes) and comprehensive (60 minutes), and three reports are generated to facilitate
individualized interventions. (www.noellevitz.com/rms)
---------------------------------------------------------------------------------------------------------------------------------
The College Success Factors Index (CSFI) (Hallberg & Davis)
Contains 80 statements designed to assess college readiness during the early
weeks of college by measuring eight factors that are purportedly related to “academic aptitude,”
namely: (a) responsibility/control, (b) competition, (c) task precision, (d) expectations, (e)
wellness, (f) time management, (g) college/school involvement, and (h) family involvement. The
test is self-scored and same-day printouts can be generated that include individual students’ total
and subtest scores, plus a “watchline” or danger zone indicator for scores falling below the
statistical average. (www.csfi-wadsworth.com)
---------------------------------------------------------------------------------------------------------------------------------
The Student Readiness Inventory (SRI) (ACT)
The SRI comprises the following 10 scales:
1. Academic Discipline—the amount of effort a student puts into schoolwork and the degree to
which the student is hardworking and conscientious.
2. Academic Self-Confidence—belief in one’s ability to perform well in school.
3. Commitment to College—one’s commitment to staying in college and getting a degree.
4. Communication Skills—attentiveness to others’ feelings and flexibility in resolving conflicts with
others.
5. Emotional Control—one’s responses to and management of strong feelings.
6. General Determination—extent to which one strives to follow through on commitments and
obligations.
7. Goal Striving—strength of one’s efforts to achieve objectives and end goals.
8. Social Activity—one’s comfort in meeting and interacting with other people.
9. Social Connection—one’s feelings of connection and involvement with the college community.
10. Study Skills—the extent to which students believe they know how to assess an academic
problem, organize a solution, and successfully complete academic assignments.
For more information, go to http://www.act.org/sri/
---------------------------------------------------------------------------------------------------------------------------------
Transition to College Inventory (TCI) (Pickering, Calliotte, & McAuliffe)
This is an extensive survey that has been designed to assess “noncognitive” variables among
freshmen, such as attitudes, opinions, and self-ratings. These noncognitive variables are
designed to yield a “probation score” and are assessed by items measuring whether students: (a)
have well-defined career plans, (b) plan to obtain a degree, (c) consider the university to be the
major focus of their lives, and (d) plan to work 11 or more hours per week during the first
semester.
When compared with the traditional cognitive measures used in the college-admission process
(i.e., standardized test scores and high school academic performance) with respect to their ability
to predict academic problems and attrition during the first year of college, the noncognitive
predictors alone resulted in higher “hit” rates (accurate predictions of academic or retention
problems) than did use of cognitive predictors alone. In fact, the cognitive predictors alone could
not correctly identify any first-year students in academic difficulty, nor could they correctly identify
any freshmen who did not return for their second year.
For information on this instrument, see: Pickering, J. W., Calliotte, J. A., & McAuliffe, G. J.
(1992). The effect of noncognitive factors on freshman academic performance and retention.
Journal of The Freshman Year Experience, 4(2), 7-30.
---------------------------------------------------------------------------------------------------------------------------------
EQ-i:S Post Secondary
The EQ-i is the first validated measure of emotional intelligence, and the first of its kind to be
peer reviewed in the Buros Mental Measurements Yearbook. Student scores on this instrument
have been empirically linked to student success in a number of published works.
For further information, go to www.mhs.com
---------------------------------------------------------------------------------------------------------------------------------
Anticipated Student Adaptation to College Questionnaire (ASACQ) (Baker &
Schultz).
This inventory asks students to rate their anticipated experience in college by responding to
statements in four subscales: (a) social adjustment, (b) academic adjustment, (c) personal adjustment, and (d)
institutional & goal commitment. An “intent-to-persist” score is generated by computing a
student’s average score for eight particular statements that have been embedded in the
instrument.
Baker, R. W., & Schultz, K. L. (1992). Measuring expectations about college adjustment.
NACADA Journal, 12(2), 23-32.
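The intent-to-persist scoring described above is a simple average, which can be sketched as follows. The item numbers used here are hypothetical placeholders: the actual positions of the eight embedded ASACQ statements are not given in the description above.

```python
# Illustrative sketch of the ASACQ "intent-to-persist" score: the mean of a
# student's responses to eight designated items embedded in the instrument.
# The item numbers below are hypothetical, not the actual ASACQ positions.

def intent_to_persist(responses, embedded_items=(3, 9, 15, 21, 27, 33, 39, 45)):
    """Average a student's ratings on the eight embedded persistence items.

    `responses` maps item number -> rating for the full instrument; only the
    eight embedded items contribute to the intent-to-persist score.
    """
    scores = [responses[item] for item in embedded_items]
    return sum(scores) / len(scores)
```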
------------------------------------------------------------------------------------------------------------
Instruments Designed to Identify At-Risk Students’
Learning Habits, Attitudes, & Motivation
Learning and Study Strategies Inventory (LASSI) (Weinstein)
Also available in electronic form (E-LASSI)
A 77-item instrument that takes about 20 minutes to complete and contains subscales relating
to attitude, motivation, time management, anxiety, concentration, information processing,
selecting main ideas, study aids, self-testing, and test-taking.
A free sample packet, which includes a self-scoring form, a user’s manual, and pricing
information, is available upon request from the publisher (hhservice@hhpublishing.com). For
further information about the instrument, you can review the Web version or download a copy
of the User’s Manual (PDF) from the following link:
http://www.hhpublishing.com/_assessments/LASSI/samples.html
------------------------------------------------------------------------------------------------------------
Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich &
McKeachie)
Similar to the LASSI, the MSLQ is a self-report questionnaire that takes about 25 minutes to
complete. It is based on the same general information-processing model as the LASSI, but there
are a few differences: (1) Its motivational scales are based on a general social-cognitive
approach to motivation that includes three key components: values, expectancy, and affect. (2) It
organizes its cognitive scales into general cognitive strategies and metacognitive strategies
(a.k.a. “executive processes” that plan and direct learning). (3) Its last general category of scales
includes resource-management factors.
(For further information about this instrument, contact the Department of Psychology, University
of Michigan.)
------------------------------------------------------------------------------------------------------------
Study Behavior Inventory (SBI)
This instrument is designed to efficiently evaluate students’ learning skills and attitudes, as well
as refer them to campus-specific personnel and programs that can be of help. It can be
completed in less than 15 minutes and consists of 46 self-report items relating to (a) academic
confidence (e.g., self-esteem & locus of control), (b) short-term study behaviors (e.g., note-taking
& reading), and (c) long-term study behaviors (e.g., exam preparation and writing research
papers). It also provides “proficiency statements” on students’ performance in specific areas
relating to student success, namely: time-management, study-reading, general study habits,
listening/note-taking, writing, test-anxiety, test-taking, and faculty relations.
(To access more extensive info on this instrument, go to the following website:
www.sbi4windows.com.) (Information on this instrument also appeared in the Journal of
Developmental Education, 21[2] [Winter], 1997.)
------------------------------------------------------------------------------------------------------------
Achievement Motivation Profile (AMP) (Mandel, Friedland, & Marcus)
A less well-known instrument designed to assess academic motivation, and to detect
“academic underachievers.” Its content derives primarily from research reported in the book,
Psychology of Underachievement by Mandel & Marcus. It was originally developed for use with
high school students but users claim it is equally applicable to college students.
The instrument is said to be available from WPS (Western Psychological Society), or contact
Peter Walsh, who works in learning support services at Ryerson Polytechnic University in
Toronto, Canada (pwalsh@acs.ryerson.ca)
------------------------------------------------------------------------------------------------------------
Behavioral and Attitudinal Predictors of Academic Success Scale
(BAPASS) (Wilkie & Redondo)
This instrument, developed by Carolyn Wilkie and Brian Redondo at Indiana University of
Pennsylvania, is designed to identify first-year students who are likely to be placed on
academic probation. It consists of 48 items with four subscales: (a)
academic behaviors & motivators, (b) stressors, (c) goals, and (d) alcohol & parties. [For
information on this instrument, see the following reference: Wilkie, C., & Redondo, B. (1996).
Predictors of academic success and failure of first-year college students. Journal of The
Freshman Year Experience, 8(2), 17-32.]