A Blueprint for Campus Accountability: Lessons from the Pace University Experience

The ideas and text of this report are for public use.
Please feel free to download and distribute copies from
www.pace.edu/assessment.
Concept: David A. Caputo
Writer: Christopher T. Cory, ccory@pace.edu
Principal Adviser: Barbara Pennipede, bpennipede@pace.edu
Design: Paul Sternberg
Compiled with help from numerous faculty and staff members at Pace University.
Contents copyright © 2006 by Pace University. All rights reserved.
Executive Summary
A Blueprint for Campus Accountability: Lessons from the Pace University Experience
1. This report addresses the way a large, multicampus metropolitan university is improving its assessments of student learning in order to become more accountable for its academic results.
2. Assessment in higher education has drawn increasing interest following the “No Child Left Behind” legislation for elementary and secondary schools. The federal Commission on the Future of Higher Education made no secret of its sympathy for measuring higher education, perhaps with federally mandated tests that parents and prospective students could use to compare institutions.
3. In the introduction, Pace University’s President, David A. Caputo, says the federal role should be
• Insisting that colleges and universities have self-assessment plans, and
• Providing financial incentives for developing appropriate assessment techniques that measure how well each institution meets its own goals for its students.
4. By presenting a variety of techniques for measuring just a few of the diverse teaching and learning activities in one institution, the report suggests the fallacy of imposing oversimplified measures on all.
5. The report focuses on methods to measure actual learning, not just the familiar inputs that often are used as proxies for it (size of the endowment, number of books in the library). Proxies are at the heart of rankings like those of U.S. News & World Report magazine.
6. The report defines assessment as different from course-by-course testing because it weighs cumulative results, gauges entire curriculums and programs, and probes value added: how well campuses build on the knowledge, learning habits, and socioeconomic status each student starts with.
7. The report describes Pace University’s decision, in the forefront of a national trend, to create a University-wide Assessment Committee of faculty and staff members, minimizing the imposition of assessment from the top down.
8. The report recounts the University’s insistence that the primary goal of assessment should be the improvement of teaching and learning.
9. A pillar of Pace’s self-assessment work has been its pioneering use of two newly developed learning measures. One, the National Survey of Student Engagement (NSSE), ascertains how much an institution uses proven educational practices to engage students in academic work held to high standards. The other, the Collegiate Learning Assessment (CLA), measures multidisciplinary abilities like critical thinking, analytic reasoning, and written communication.
10. The report also presents some of Pace’s other self-assessment techniques, including faculty and student “portfolios” and the use of outside academic reviewers.
11. The report outlines two significant results for students: increases in the active learning that the NSSE assesses, and improved analytic skills and understanding of connections across subjects.
12. The report makes clear that self-assessment is an active and ongoing process, yielding lessons that need communicating to faculty and staff members, students, and the community at large.
Contents
Introduction: Monitoring the Mosaic by David A. Caputo. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
I. Inner- and Outer-Directed Motivations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
II. Introductory Steps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Self-Assessment Ingredients. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
III. Measuring Active Learning and Comprehensive Thinking . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
What It’s Like to Take the NSSE. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
What It’s Like to Take the CLA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
The Six Pillars of Assessing Core Courses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
IV. Other Markers of Engaged, Connected Learning. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
The Biology Department Finds Good News about Analytical Thinking. . . . . . . . . . . . . . . . . . . 19
Dickens? Wolfe? No, But Not a Bad Start for a Business Memo. . . . . . . . . . . . . . . . . . . . . . . . . . 20
V. Beyond Data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
VI. Traditional Inputs and Outcomes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
VII. Next Steps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
VIII. Lessons for the Future . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Appendix: Other Self-Assessment Techniques at Pace University . . . . . . . . . . . . . . . . . . . . . . . . . 30
Introduction: Monitoring the Mosaic
This report shows how one large, private, multicampus metropolitan university has taken the initiative
to improve the way it assesses student learning so it can better hold itself accountable to its students and
supporters for its academic results.
Assessment is increasingly “hot” amid the emphasis on measurement via “high-stakes testing” that
is mandated for elementary and secondary schools by the federal “No Child Left Behind” legislation.
Assessment is on the minds not only of responsible leaders in higher education, who have given it increasing
research attention for the last two decades, but also of parents, college students, the general public, elected
officials, and the federal Commission on the Future of Higher Education, scheduled to report in late 2006.
Pace University, with more than 14,000 undergraduate and graduate students and 1,130 full-time and
adjunct faculty members, is celebrating its 100th anniversary during 2006. Its multicultural enrollment
reflects all ages, colors, and cultures. (Undergraduates are approximately 52 percent non-Hispanic white, 13
percent Hispanic, 12 percent African-American, 12 percent Asian, 6 percent international students, and 5
percent “other.”) Pace students reflect diverse levels of preparation and experience, studying full and part time
on campuses in both New York City and suburban Westchester County.
In short, Pace students represent a rough cross section of the mainstream of higher education in the United
States today.
Measurement is far from the only ingredient in high quality education at Pace. Learning also takes academic
freedom for professors with keen, original minds and the skill and heart to communicate their knowledge.
Recruiting, developing, and retaining them remains one of Pace University’s glories, though one of the
hardest to quantify.
In addition, over the last five years Pace has steadily supplemented tests and final exams, which measure
students one by one and course by course, with a mix of other techniques to gauge individual students’
learning across courses and look at the overall impact the University has on groups of students. While such
self-assessment has come more slowly to some parts of the University than others, our experience has shown
that it helps keep us accountable to ourselves and others for meeting students’ needs as well as the needs of
the larger society for a trained workforce and new ideas. We believe such accountability is essential as the size
and cost—and value—of higher education grow to meet the demands of a global economy.
The best lesson from our self-assessment efforts has been that self-assessment is not a vehicle for keeping
score, but for getting better.
We also have learned that:
• Meaningful measurement is not inexpensive.
• Self-assessment is not easily understood—or mandated.
• Assessment has to be spurred from within by a desire to improve outcomes for students, not imposed from without.
• Assessment must take into account the special values and aspirations for students that are held by each component of the institution, particularly its academic disciplines.
The last point illuminates the growing United States discussion of national testing. For higher education,
uniform national testing would be neither practical nor desirable. Among other things, it would hurt
students who are most at risk because their progress would be unfairly compared with that of students who
have greater cultural or educational advantages. Given the great diversity of higher education institutions in
this country, a one-size-fits-all approach would not yield meaningful results.
A better approach is for institutions in the mosaic of U.S. higher education to be judged by their particular
goals for their students, whether they have comprehensive self-assessment plans to track their progress, and
what those self-assessments say about the results. The federal government should:
• Insist that colleges and universities have an ongoing self-assessment process;
• Provide financial incentives for developing evaluation tools, to be selected institution by institution, that measure how well each institution meets its goals for the students it matriculates; and
• Encourage measurements of how well institutions improve (add value to) their students’ educational achievements.
To encourage local variations and experimentation, federal funds should be administered by the states.
Results should be used to guide improvements.
These measurements will further empower people’s ability to make informed choices, which is the very
essence of allowing the free marketplace to operate.
For students and parents choosing colleges, and for individuals and governments supporting them, the sanest
quest has never been for some mythic campus that is best for everyone, but for the one that is best for the
individual student. Making matches one by one takes more work than doing it by standardized formula, but
having a wide range of choices is a lot like making a match for marriage—slightly unpredictable, but with
higher odds of satisfaction in the long run. Few of us want assembly-line spouses—or alma maters.
We hope this Pace University case study will be useful to all those with a concern for the success of students
in the colleges and universities of today and tomorrow.
David A. Caputo
President, Pace University
I. Inner- and Outer-Directed Motivations
“Writing some 275 years ago in The Spectator, Joseph Addison observed,
‘Numbers are so much the measure of everything that is valuable
that it is not possible to demonstrate the success of any action,
or the prudence of any undertaking, without them.’ ”
—Peter T. Ewell
“In America, if you can’t count it, it doesn’t count.”
—Daniel Patrick Moynihan
“Everything that can be counted does not necessarily count;
everything that counts cannot necessarily be counted.”
—Albert Einstein
“Trying to measure what will influence a young person’s life
is like trying to predict the height of a redwood from the diameter
of its sapling, using a rubber band for a ruler.”
—Educational Testing Service official
Pace University got off to an early start on academic self-evaluation in the 1990s and believes it is now one
of the leading United States institutions of higher education in the comprehensiveness and scope of its
self-assessment processes.
Almost all U.S. campuses make some efforts to assess their teaching and learning, but few apply as many
techniques to as many parts of the university as Pace, starting at the top with assessments of the president
and senior officers. After a review of assessment plans at many of its member campuses, the National
Association of Independent Colleges and Universities wrote on its Web site that “Many schools seem to
have an interest in accountability and/or assessment but tend to focus on one or two topics like financial
assessment, student disclosure, … innovations in college pricing, campus cost control, and collaboration
initiatives.” “Few faculty as yet practice assessment as a part of their everyday work,” according to an essay
by Peter T. Ewell, a political scientist who is vice president of the National Center for Higher Education
Management Systems, in the 2002 book, Building a Scholarship of Assessment, edited by Trudy Banta.
By contrast, as of the spring of 2006, four out of Pace University’s five undergraduate schools had working
self-assessment plans and the fifth was moving toward adopting one. Nearly 70 percent of the academic
departments had plans, as did almost half of the administrative departments, where self-assessment plans
relate their work to supporting “the learning and teaching environment.” The rest of the University’s
departments had plans in various stages of development. (The University’s self-assessment Web site is at
www.pace.edu/assessment.)
“Assessment has become a word in everyone’s vocabulary,” even if campus skeptics do not always find
the connotations positive, according to Barbara Pennipede, who directs self-assessment activities for the
University. After a month at Pace, a newly hired associate provost commented that one of the first things he
noticed when he arrived was the University’s “ethos of assessment.”
As a result, in the last few years Pace University faculty members and administrators have been invited to
make presentations on accountability and self-assessment at conferences including those of the College
Board, the American Association for Higher Education, the Association of American Colleges and
Universities, the Association for Institutional Research, the National Assessment Institute, the Middle States
Commission on Higher Education, and the Mid-Atlantic Association of College Business Administrators.
Seeds of change
Assessment is as old as human judgments of other humans… and final exams.
By and large, however, until the last 20 years higher education has been measured not by its outputs
—how much it changes its students—but by inputs like the number of books in the library, the quality
of students who are accepted, the presence of new campus buildings, the comparative salaries of
professors, and the proportion of them who have achieved the final degree in their field, such as a PhD.
The connection between inputs and outputs either was taken for granted (after all, don’t Ivy League
institutions have a lot of graduates who became President?), or educators threw up their hands at the
difficulties of separating the results of education from all the other influences on human beings.
The current trend is different because it uses newly sophisticated techniques for diagnosing what learning
really goes on and how to improve it.
“Most campuses still do assessment because somebody tells them to,” according to Ewell. Pace University’s
President Caputo, a social scientist with a PhD in political science from Yale, became convinced about
the value of assessing teaching during his 26 years as a professor of political science at Purdue University.
Using tests at midpoints in his courses as well as finals, he recalls that as a young instructor “several times I
learned that something I was doing didn’t work and was able to change it before the class ended.”
By the time he came to Pace in 2000, after five years as president of Hunter College of the City University
of New York, he had found many reasons for disenchantment with the traditional inputs model. He had
developed a conviction that student learning had to be improved throughout higher education and an
increasing concern that ill-conceived tests might be imposed by federal or state governments. His ideas
were strengthened further by service as cochair of the New York State Regents’ Professional Standards and
Practices Board, which establishes benchmarks for the education of elementary and secondary teachers,
and as a participant in the early development of the Collegiate Learning Assessment (see page 15).
Promising seedbed
Caputo’s ideas about self-assessment went hand in hand with significant trends in business and nonprofit
management toward program evaluation, scientific management, and systems thinking. A key concept for
education was the idea of “value added,” notably promoted by Alexander Astin, a scholar of the field at
UCLA. Since incoming student ability has been found to be the largest predictor of any student outcome,
this concept argues that however difficult it is in practice, institutions should be judged not by what Ewell
calls “traditional markers like resources and reputation” but by what they do for the students they recruit
after the students arrive.
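The report does not give a formula, but assessment researchers often operationalize value added as the gap between a group’s actual outcome and the outcome predicted from incoming ability alone. Here is a minimal sketch of that residual approach; the numbers and the simple linear prediction are illustrative assumptions, not a method the report prescribes.

```python
# Value added as a residual: actual outcome minus the outcome predicted
# from incoming student ability. All numbers and the linear model are
# hypothetical; the report does not prescribe this computation.
import numpy as np

incoming_ability = np.array([1000.0, 1100.0, 1200.0, 1300.0])  # e.g., mean entrance scores
actual_outcome   = np.array([1080.0, 1150.0, 1260.0, 1340.0])  # e.g., mean exit scores

# Predict outcomes from incoming ability with a least-squares line.
slope, intercept = np.polyfit(incoming_ability, actual_outcome, 1)
predicted = slope * incoming_ability + intercept

# Positive residuals suggest gains beyond what incoming ability alone
# would predict; negative residuals suggest the reverse.
value_added = actual_outcome - predicted
for ability, gain in zip(incoming_ability, value_added):
    print(f"group entering at {ability:.0f}: value added {gain:+.1f}")
```

On this view, a program whose students enter with modest scores but post large positive residuals is “adding” more than a selective one whose graduates merely arrive strong.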
At the same time, educational research was making professors and college administrators question traditional
college teaching methods, especially the lecture, which came to be mocked as “the sage on the stage.” The
research findings showed that active student participation helps learning more than just hours spent in class.
As Ewell writes, the last two decades have seen a revolution in undergraduate instruction, one that is
“fundamentally altering the shape and content of undergraduate study.” The change, he says, stems from
technology and from other teaching movements like “writing across the curriculum,” “learning communities,”
“problem-based learning,” and “service learning”—all of them involving seminars, group work, and projects,
often with practical outcomes.
Pace University had been attuned to pedagogical innovation since its birth in 1906 as a pioneer in systematic
ways of teaching accounting. Moreover, Caputo was able to work with a team sympathetic both to change and
to ways of weighing it.
• Provost and Executive Vice President for Academic Affairs Joseph Morreale, previously head of the University’s office of research, assessment, and academic support, is nationally recognized in the field of university assessment. It was he who developed the University-wide assessment program at Pace, establishing the University assessment committee and creating the University’s annual “scorecard” (see page 26). Most recently he coauthored Post-Tenure Faculty Review and Renewal III, an American Association for Higher Education report on how nine anonymous institutions handle tenured faculty members, based on more than 400 interviews and questionnaire responses from 1,600 faculty members.
• Barbara Pennipede, assistant vice president for planning, assessment, research, and academic budgeting, has more than 15 years of experience with assessment testing and analysis at Pace and at St. Peter’s College in Englewood Cliffs, New Jersey. She is a frequent peer evaluator on Middle States Commission accreditation teams that assess assessment at other campuses.
• Peter Seldin, a faculty member at the University’s Lubin School of Business, is the national leader of the exploding “teaching portfolio” movement (see page 22). In the portfolio process, professors work with noncompetitive mentors from other departments to evaluate portfolios of their teaching and find areas for improvement.
By the time Caputo arrived at Pace, a critical mass of faculty members, including several influential faculty
leaders, was already involved in improving teaching and learning. A Center for Instructional Technologies had
been established in 1997 at about the same time a Center for Faculty Development was set up, both operated
by faculty members compensated through released time or a stipend. The centers were well launched on
their present program of finding promising teaching resources and sources of information, and of organizing
seminars and other sessions, many of which emphasize self-assessment.
Public pressures
Pace’s leaders were well aware of a national interest in stronger assessments for higher education. In recent
years, parents, students, taxpayers, government officials, and private donors have shown increasing concern
about college costs and increasing compassion for the financial burdens on students. The people paying for
education want to make sure they get their money’s worth, and campuses have looked for ways to show donors
and legislators they are efficiently spending donations and appropriations. Meanwhile, competition for those
resources has grown from other areas of society, such as programs for older people and health care.
Added to these prods is a competitive higher education marketplace in which colleges and universities face
rivals in other countries and among stripped-down profit-making institutions. Institutions determined to
keep high prices from impeding access have turned to sophisticated management measurements to help
them hold down costs. And colleges and universities have been trying to find substitutes for what they
deem superficial measurements of their work, ranging from the much-criticized formulas of U.S. News &
World Report magazine to the casual anecdotes and gossip on Web sites like RateMyProfessor.com. These
issues have been refracted through the debates in Congress during 2005 and 2006 over reauthorizing the
federal Higher Education Act.
But as the external pressures became more apparent, Pace University already had begun to move.
II. Introductory Steps
Self-assessment was strongly built into Pace University’s strategic plan, Reaching New Heights, which the
trustees adopted in 2003.
The plan’s core objectives frequently specify measurable outcomes.
• To “strengthen Pace’s academic excellence and reputation,” for instance, the plan specifies “four new faculty positions will be added each year with a total of 20 by 2008.”
• To “reinforce Pace’s commitment to being student-centered,” the plan specifies improvements like “increase … endowment funds available for scholarships… by one percent per year.”
• To “strengthen Pace’s financial situation,” the plan calls for such measures as increasing the endowment by $45 million by 2008.
• To “celebrate the University’s Centennial,” the plan urges activities to strengthen Pace’s “standing and endowment.”
And the plan says the University will “establish our accountability and evaluation efforts across the
University as a national model.”
The objectives for strengthening the financial situation and celebrating the Centennial do not play major
roles in this report, since they are means to academic excellence but do not measure it. To be sure, they are
essential, and Pace assiduously measures many of their components. (See appendix.)
Introduction tactics
Much of the progress on self-assessment at Pace has come through encouragement, not enforcement.
Barbara Pennipede, who has taken on the lion’s share of implementation, loses no opportunity to remind
faculty and staff members that “Assessment comes from the Latin, ad+sedere, to sit beside,” and to insist that the emphasis be on student improvement, not faculty evaluation. Speaking of portfolio self-evaluation by faculty members, Sandra Flank, a professor of education and former coordinator of the
University’s Center for Faculty Development, says the initiative “came from the faculty but was supported
by the administration.”
Of course, encouragements for assessment at Pace have sharper teeth as well. For example, it is harder for
faculty or staff members to dismiss a problem as atypical or a one-year fluke if data show it is widespread
and recurs from year to year.
Moreover, planning and progress toward goals of the strategic plan are keys to individual evaluations on
which raises are based. University committees, most of which include faculty members, recommend very
few proposals that do not further the goals of the strategic plan and contain some idea of how success
will be measured. In two separate forms, requests for new funds must specify which plan objectives the
requests will help meet and how. Looming in the background are increasingly specific demands from
accrediting agencies for self-assessments and the knowledge that loss of accreditation could hurt the
lifeblood of any university—enrollment.
Though some goals of the Pace strategic plan are broad enough that it is fairly easy to claim that proposals
advance them, at the very least the need to go through the plan reinforces thinking about priorities. To
improve academics, for instance, upgrades for accounting labs recently got priority treatment. So did
streamlined student services, a major step toward being student centered. But new furniture for some
offices and political poll-taking in foreign languages did not.
Whether a department has an internal strategic plan, with self-assessment components, is weighed when
departments’ requests for new or replacement faculty members are considered.
Starting at the top
The president of the University and the chair of its trustees, Aniello Bianco, set an early self-assessment
example by insisting on evaluations of trustees and senior officers, including themselves. Each trustee
knows he or she will be judged on attendance at board meetings and at University-wide events,
participation in committees and conferences on University business, and dollars raised. At the president’s
request, senior officers must set and report progress on 6-, 12-, and 18-month goals.
More striking, in 2003 Pace became one of the first universities to invite, every spring, all staff and faculty
members to rate the president, every officer at the vice-presidential level or above, and every academic
dean. The rating considerations, worked out with a consultant, are well-accepted attributes of good
leadership like articulating a mission, respecting differences, and leading on sensitive issues. “The use of
an outside vendor made the assessment more acceptable because a third party does it,” Provost Morreale
says. Electronic privacy safeguards encourage candor and the results are sent directly to the consultant for
tabulating. The project cost $30,000 to start and takes $18,000 a year to repeat.
Aggregate results (though not individual scores) are posted on the internal Web site for all to see. Each
person rated gets his or her own data, plus open-ended comments. (Comments to Caputo have ranged
from a grumpy “leave the University” to encouragement for pushing an academic integrity code and
suggestions for more computer training and outreach to students.)
In the early years, administrators got their lowest marks for personal flexibility. This is important in a
university that has set a core objective, as Pace has, of becoming more student-centered, and the survey
corroborated nagging worries. As a result, the president made flexibility a priority in his annual performance
reviews with his direct reports. While cause and effect are hard to prove, flexibility ratings have since gone
up. Another finding has been that administrators are viewed most positively by those who know them best,
which has led to attempts to increase leaders’ interactions with multiple groups around campus.
The rating technique is less probing than some forms of executive evaluation used in the corporate world,
but inviting this kind of frankness is a major step in an academic culture that in theory and often in fact is
based on collegiality.
Going beyond ratings, in 2005 Caputo insisted on more comprehensive 360-degree feedback sessions on his
performance with every officer directly reporting to him, the results of which he shared with the trustees.
The payoff of leadership evaluation for students has been shown by studies at various locations confirming
that improved work by senior staff members increases productivity for all.
The committee, the plans, and the ineffables
As these measures were being developed, in the spring of 2002 the president encouraged Morreale,
then vice president for planning, assessment, research, and academic support, to create a University-Wide Assessment Committee, cochaired by a faculty member and Pennipede. Though charged with
tracking progress under the strategic plan, its primary focus was academic. It included staff members,
administrators and students, but was largely made up of faculty members from all of Pace’s six schools.
The committee has shaped and taken “ownership” of many self-assessment processes at Pace, and the
committee, not just administrators, reports back to faculty meetings.
In the fall of 2004, the committee and administration announced a bold requirement: that every academic
and staff department would have to prepare a self-assessment plan. The committee and the assessment
office collaborated on multiple presentations of what would be involved and offered copious help and
consultations by committee members, Pennipede, and her staff.
The heart of the request was a simple and permissive template with places for specifying one or two outcomes,
and suggesting the use of timetables, measurements, and responsibilities assigned to specific people.
Self-Assessment Ingredients

Requirements for the self-assessment plans that all departments at Pace must prepare are far from rigid. Derived from the “template” originally developed by the Assessment Committee, the ingredients now are listed in the following follow-up questionnaire that the committee sends out once a year.

FOLLOW-UP QUESTIONNAIRE (SAMPLE)

As an effort to establish a “climate of assessment,” the University-Wide Assessment Committee is interested in the progress you are making. While this is primarily an internal process for your department or unit that should be self-motivated and with results kept for your internal use, we would appreciate being kept apprised of your progress. Please send to bpennipede@pace.edu as an attachment.

DEPARTMENT OR UNIT ______
OUTCOME ______

1. This outcome is: [ ] Fully met  [ ] More than 50%  [ ] Less than 50%  [ ] Not met  [ ] Ongoing  [ ] Other (explain)
2. Who was affected by this outcome: [ ] Faculty  [ ] Staff  [ ] Students  [ ] Administration
3. How did you measure this outcome: [ ] Collected data  [ ] Anecdotal  [ ] Other (explain)
4. The timetable for this outcome was ______. Has been met: [ ] Yes  [ ] No. If No, should be extended to ______
5. Work on this outcome was completed by: [ ] Chair  [ ] Faculty committee  [ ] Staff  [ ] Students  [ ] Administration  [ ] Other

If you choose, attach your evidence that this outcome has been met.

How will you use this finding or evidence to further improve your department or unit?
Based on the above, what new goals will you develop and what outcomes will you establish?
What assessment instrument(s) will you use to measure this/these new outcomes?
Obstacles encountered and how we can be of help:

Submitted by ______  Date ______

Please feel free to call on the University-Wide Assessment Committee for any assistance you need or visit our Web site on the Pace home page (search under A for Assessment).
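Because every department fills out the same template, responses can be rolled up across units. The following is a minimal sketch of such a tally, assuming hypothetical department names and using the status and measurement fields from questions 1 and 3 above; the report does not describe how the committee actually tabulates returns.

```python
# Tally follow-up questionnaire returns by outcome status (question 1)
# and by measurement method (question 3). All entries are invented.
from collections import Counter

returns = [
    {"unit": "Biology", "status": "Fully met",     "measure": "Collected data"},
    {"unit": "English", "status": "Ongoing",       "measure": "Anecdotal"},
    {"unit": "Finance", "status": "More than 50%", "measure": "Collected data"},
    {"unit": "Nursing", "status": "Fully met",     "measure": "Collected data"},
]

status_counts = Counter(r["status"] for r in returns)
data_based = sum(1 for r in returns if r["measure"] == "Collected data")

print(status_counts)
print(f"{data_based} of {len(returns)} units measured with collected data")
```

A roll-up like this is what lets the committee say, for example, what share of departments measured their outcomes with data rather than anecdote.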
Since plans were requested, the administration has been sensitive to the fact that departments are at
different stages of readiness to plan and have varying objectives. Schools like nursing and departments
like accounting, used to preparing students for external certification exams in their professions, had more
familiarity with measurement techniques and more techniques available to them than those teaching
more theoretical subjects and the liberal arts and sciences.
The leaders of the self-assessment effort candidly asked for help in measuring what are known in
assessment circles as “the ineffables”—characteristics such as leadership and ethics. At Pace, the committee
firmly declared that:
Learning includes students growing in knowledge, in the application of that knowledge in their work
and in meeting life’s other challenges, and in the continuous development of sensitivities, values, and the
capacity to act in ways that benefit those immediately around them and, ultimately, all people. Because
learning is a process of lifelong growth, learning at Pace can be seen as making a contribution to a life
pattern that began before students became members of the Pace community and will continue long after
they have moved on.
At the start, many departments felt imposed upon and worried about the costs in time and dollars.
Though questions remain, some of that feeling has eased, according to Morreale, the provost, because
of the focus on student learning and improvement instead of faculty evaluation, the acknowledgement
that departments have different kinds of goals for their students, and the freedom the template allows for
departments to devise their own systems. In addition, the University set up a series of town meetings and
Q&A sessions to answer questions and dispel concerns.
Perhaps most important, the University provided significant resources. Presidential grants from Caputo
supported faculty and student assessment using portfolios (see page 22), a handbook to help faculty make
the most of classes organized as “learning communities” (page 17), and the development of specialized
assessment tools. These grants continue to support further self-assessment projects. Funding was
continued for the Center for Faculty Development and the Center for Teaching, Learning, and Technology.
The centers now collaborate on a three-day Faculty Institute on teaching and self-assessment held each
spring after the end of classes.
III. Measuring Active Learning and Comprehensive Thinking
Pace students who visit the University’s assessment Web site learn that before they graduate they will
participate in at least 12 institution-wide assessment activities.
One of the most significant is the National Survey of Student Engagement (NSSE), which measures how much students are engaged in the active educational practices that promote learning. The other self-assessment tool is the Collegiate Learning Assessment (CLA), which probes students’ ability to think about complex situations. (See boxes, pages 14 and 16.)
These tests are beginning to suggest that parents and students have other ways to size up the value of
colleges and universities than the largely input-based rankings of U.S. News & World Report and its ilk.
The two instruments try to measure aspects of learning that often have been only guessed at in the past—
how well the University uses techniques for engaged learning across all disciplines that research has shown
produce high levels of success, and how well the University inculcates higher order skills of synthesis and
analysis that are useful in all human activities, regardless of one’s specialty.
Developed by nonprofit research institutions and assessment specialists with early and enthusiastic Pace
participation, these instruments are “low-stakes” but “high value.” The stakes are low for students because
institutions use aggregate results to assess internal issues, not to make decisions about individuals. The
stakes are only somewhat higher for teachers, who find diagnoses — also in the aggregate — of factors that
determine or demonstrate their success. The instruments have high value because they illuminate areas
needing change far more specifically than generalized calls for better professors or more funding. (Some of
the other instruments used at Pace are listed in the appendix.)
The National Survey of Student Engagement: Good learning is active
The spring of 2006 marked the fifth consecutive year of Pace participation in the six-year-old, nationwide
instrument known as NSSE or “Nessie.” It measures good educational practices, validated by years of
research, that can be viewed as proxies for high-quality learning, but that are much more tightly linked to
it than the size of the endowment. The research has shown that students learn most when they find their
academic work challenging, actively collaborate on learning activities, and interact with faculty members
and diverse peers inside and outside the classroom. Working with others in solving problems or mastering
difficult materials, says a brochure from the center that administers the surveys, prepares students to deal
with “the messy, unscripted problems they will encounter daily, during and after college.”
Both paper and Web questionnaires are sent to random samples of first-year and senior students at participating colleges and universities by the Center for Postsecondary Research, based at Indiana University Bloomington. NSSE is funded by foundations, the federal Fund for the Improvement of Postsecondary Education, and fees from users. At Pace, it is supplemented by two parallel surveys — the
Faculty Survey of Student Engagement, which asks faculty members to reflect on the importance they
place on key educational practices, and the Law School Survey of Student Engagement, which provides
insights into law school educational practices and environments.
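As a rough illustration of the random-sample design, the sketch below draws a fixed-size sample from each of the two class years NSSE surveys; the roster sizes, sample sizes, and student IDs are invented, not NSSE’s actual parameters.

```python
# Draw a simple random sample of first-year students and seniors,
# the two class years NSSE surveys. All IDs and counts are hypothetical.
import random

first_years = [f"FY{i:04d}" for i in range(1, 2001)]  # 2,000 first-year IDs
seniors     = [f"SR{i:04d}" for i in range(1, 1501)]  # 1,500 senior IDs

rng = random.Random(2006)  # fixed seed so the illustration is reproducible
invited = rng.sample(first_years, 400) + rng.sample(seniors, 400)
print(len(invited), "students invited")
```

Surveying both ends of the undergraduate experience is what lets a campus compare how engagement changes between the first year and the senior year.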
What It’s Like to Take the NSSE: Reporting the Academic Activities that Really Give You an Education
Taking about 15 minutes, the National Survey of Student Engagement (NSSE) asks students to report
on formal and informal academic matters that are the heart of memorable learning. For instance, it asks
students how much they:
• Write papers longer than five pages
• Take courses that emphasize synthesizing and organizing ideas
• Discuss ideas from reading and classes with faculty members outside of class
• Talk about career plans with a faculty member or adviser
• Work with a faculty member on a research project
• Work with other students on class projects
• Talk with students from different races or ethnicities or who hold different religious beliefs and political opinions
• Participate in internships or field experiences and community service work
• Use electronic technology to discuss or complete assignments
• Participate in cocurricular activities
• Participate in a culminating senior experience.
Students also rate factors like
• The quality of their relationships with other students, faculty members, and administrative and office personnel, and
• Whether the campus environment encourages spending significant time on studying and academic work.
The NSSE survey of first-year students and seniors at Pace in 2004 produced good news and bad. (The
2005 administration was done by Pace rather than the Center for Postsecondary Research in order
to provide separate results for each of Pace’s schools. Tabulation was nearing completion by late spring
of 2006.) Compared with previous years, the 2004 survey showed that Pace seniors had increased their
frequency of class presentations, work with classmates, use of electronic technology, internships and field
experiences, foreign language course work, and cocurricular activities. First-year students reported strong levels of some of these activities, as well as of working with faculty members on activities other than course work, participating in community-based projects as part of regular courses, and doing community service or volunteer work. Significant numbers also said the institution “helps you cope with your non-academic responsibilities.”
In other good news, the NSSE found large numbers of students having serious conversations with fellow
students of a different race or ethnicity. This finding corroborated one from a survey Pace created and
administered during the 2003–2004 academic year. Asked to say if they agreed or disagreed with the
statement “I feel I need to hide some characteristics of my racial and ethnic culture in order to fit in at
Pace,” only 10 percent of both undergraduate and graduate students agreed.
On the other hand, the NSSE showed seniors seeming to lag behind their peers in comparable institutions
on their number of written assignments over five pages, how often they got prompt feedback from faculty
members on their academic performance, and how highly they rated the quality of their relationships with
other students. First-year students reported somewhat low levels of preparing for class and of course work
emphasizing the application of theories and concepts.
Results like these have provided support for a number of initiatives to increase active learning, some of
them already getting underway as use of NSSE began. They have included first-year learning communities,
increased faculty emphasis on different modes of learning, and greater development of senior capstone
experiences.
To increase civic engagement, for example, in 2003 the University became one of 10 campus founders of
Project Pericles, a consortium funded by the philanthropist Eugene Lang to start students on a lifelong
commitment to participating in democratic processes. At Pace, Periclean activities have gone hand in hand
with a core requirement that students take at least one course with a component of community service
and a chance to analyze and reflect in class on why the service is needed. Students now serve in and analyze
a myriad of institutions from museums to social service agencies, governments, and nonprofit media
organizations.
Not surprisingly, the number of courses with civic engagement components increased from 10 the first
year to more than 40 in 2005–2006. Project Pericles also coordinates cocurricular civic programs open
to the entire Pace community, many along thematic lines (Democracy in Action, Political Action Week
for Human Rights, Global Citizenship Week). In the same period, the number of people attending events
sponsored with help from the project rose from 228 to more than 1,600.
NSSE results showed that students did not report talking much with faculty members or advisers about
career plans and that only slightly more than half credited Pace with a “supportive campus environment.”
Even allowing for some students’ cynicism, this was troubling, because research shows that interaction with
faculty members not only promotes learning, but is the single most important factor in retaining students.
Among other venues, the results were presented at a one-day faculty conference. As one result, conference
participants report that they now are increasing the pizza parties and cups of coffee they have with
students, and more important, are more often involving students in research projects and maintaining
greater contact with their first-year advisees. In one upper-level marketing course, the professor ramped
up her occasional reviews of seniors’ résumés, making a one-on-one session part of the course for every
student.
The Collegiate Learning Assessment: Good learning connects the dots
The second innovative, nationwide evaluation tool getting serious attention at Pace, the Collegiate
Learning Assessment, was developed by the Council for Aid to Education with the RAND
Corporation. It is one of the first standardized measures to gauge college-level critical thinking, analytic
reasoning and written communication. Pace participated in the pilot phases of this assessment in 2002 and
in its first official administration for freshmen and juniors in the fall of 2004 and for seniors in the spring
of 2005.
What It’s Like to Take the CLA: Would You Buy this Airplane… or this Argument?
The Collegiate Learning Assessment takes 90 minutes on a computer. It has two kinds of exercises. The first is a
“real life” activity like preparing a memo or policy recommendation. An illustrative example:
You advise Pat Williams, the president of DynaTech, a company that makes precision electronic
instruments and navigational equipment. Sally Evans, a member of DynaTech’s sales force, recommended
that DynaTech buy a small private plane (a SwiftAir 235) that she and other members of the sales force
could use to visit customers. Pat was about to approve the purchase when there was an accident involving a
SwiftAir 235. Your document library [on the test-taker’s computer] contains the following materials:
1. Newspaper article about the accident
2. Federal Accident Report on in-flight breakups in single-engine planes
3. Internal Correspondence (Pat’s e-mail to you and Sally’s e-mail to Pat)
4. Charts relating to SwiftAir’s performance characteristics
5. Excerpt from magazine article comparing SwiftAir 235 to similar planes
6. Pictures and descriptions of SwiftAir models 180 and 235.
Sample questions: Do the available data tend to support or refute the claim that the type of wing on the
SwiftAir 235 leads to more in-flight breakups? What is the basis for your conclusion? What other factors
might have contributed to the accident and should be taken into account? What is your preliminary
recommendation about whether or not DynaTech should buy the plane and what is the basis for this
recommendation?
The test preparers say these kinds of exercises “require students to marshal evidence from different sources;
distinguish rational from emotional arguments and fact from opinion; understand data in tables and figures; deal
with inadequate, ambiguous and/or conflicting information; spot deception and holes in the arguments made by
others; recognize information that is and is not relevant to the task at hand; identify additional information that
would help to resolve issues; and weigh, organize, and synthesize information from several sources.”
A second CLA exercise involves analytic writing, asking students to make an argument (for example, why
they agree or disagree that “there is no such thing as ‘truth’ in the media. The one true thing about the
information media is that it exists only to entertain.”). Other exercises ask for a critique of an argument,
such as an elementary school principal’s justification for opposing nearby fast-food restaurants on the grounds that his students are obese.
These exercises “measure a student’s ability to articulate complex ideas, examine claims and evidence, support
ideas with relevant reasons and examples, sustain a coherent discussion, and use standard written English.”
Interim reports from early trials of the CLA are still awaiting a meeting of researchers from participating
campuses in the summer of 2006 that will sort out results and how to interpret them. But like NSSE, the
CLA is likely to boost the momentum behind another significant movement at Pace — encouraging a more
coherent, less fragmented college experience in which students evaluate information, apply it to complex
problems, and “connect the dots” between ideas. As time goes on, many of these programs are expected
to produce improved scores on the CLA. Two programs for which the CLA should provide feedback are learning communities and a recently installed core curriculum.
Learning communities. Coherent learning at Pace is embodied in a student’s early years in
multidisciplinary courses where paired professors teach in “learning communities,” groups of students
who take both subjects together and get to know one another.
The Pace faculty made learning community courses a central part of the required core curriculum in the
fall of 2003 after a pilot test in 2002–2003. Students are strongly encouraged to take at least one learning
community in their first year.
The courses’ intellectual sparkle often makes adults wish they could come back to college and sign up.
“Writing, Computing and the Human Condition” is based on the insight that “Like writing, computing
is as much about structure as it is about creative and critical thinking”; “The Hudson River and the
American Tide,” with field trips on the river and to historical places on the banks, covers the river as an
economic corridor, the birthplace of artistic movements, and a subject of ecological policy debates. Some
course titles are self-explanatory and highly topical, like “Scandal in the Boardroom: Business, Ethics and
Information Technology.”
In addition to improved learning, students who take learning community courses their first semester
come back at a rate nearly six percent higher than students who do not. Student comments, collected by
evaluation questionnaires, help explain why. One student wrote: “Other classes are simply memorization
and busy work.” Said another: “When a student takes five different courses and has to switch his or her
frame of thought often to accommodate the subjects, it is nice to know that two of those classes have
something to do with each other. It is much easier to focus on the class when there’s a common theme.”
Initially, these courses were driven by promising experiments elsewhere and faculty members’ personal
belief in them. Although the entering class of 2003, the first to experience them, was only finishing its
third year by the spring of 2006, the value of learning communities has been tentatively borne out by early
self-assessments that Pace faculty members developed. Thus spurred, the number of learning community
courses grew from 15 to more than 100 and annual enrollment increased from 200 to more than 1,200
in the 2005–2006 academic year. Pace, a sponsoring member of the Atlantic Conference on Learning
Communities, is hosting the conference’s Learning Community Open House, with workshops for nearly
100 educators on topics like developing learning communities and, of course, assessing them.
Core curriculum. Self-assessment of the trend toward more integrated learning broadened in the winter
of 2006 as a Learning Assessment Grant from President Caputo’s office provided small stipends, materials
and meals at meetings for 15 Assessment Fellows from the faculty of the University’s Dyson College of
Arts and Sciences, where many of the learning community courses are taught. A respected faculty leader
receives a one-course reduction in her required teaching load to work with the fellows on identifying
outcomes and assessment tools (see box, next page) that move beyond the learning communities and try
to measure three of the 12 skills specified as goals of the University’s Core Curriculum — communication,
analysis, and civic engagement.
The Six Pillars of Assessing Core Courses
The faculty members who are Pace’s Assessment Fellows have developed and are now testing six basic questions
for evaluating courses in the University’s core curriculum. The questions apply to all disciplines. In this
example, the illustrative answers involve reference skills.
1. What general outcome are you seeking?
For example, students will be able to find secondary sources of information on one of the course
topics.
2. How would you know it (the outcome) if you saw it? What will the student know or be able to do?
For example, students will be able to locate and retrieve a scholarly journal article on a selected topic
using one of the library’s research databases.
3. How will you help the students learn it? (in class or out of class)
For example, students will be given an introduction to appropriate library research databases, and will
be instructed on how to create effective search statements.
4. How could you measure each of the desired behaviors listed in #2?
For example, students will submit an annotated bibliography of sources with their research project,
in which the students will evaluate each source on its usefulness for their purposes, and will describe
how they retrieved each item.
5. What are the assessment findings?
6. Based on assessment findings, what improvements might be made?
IV. Other Markers of Engaged, Connected Learning
In addition to a core that stresses active and interdisciplinary learning, other academic areas at Pace have
increased their linkages between subject areas.
Multidisciplinary courses. Some new academic programs have cross-disciplinary titles — “Information
Assurance in the Criminal Justice System,” a minor in computer science for finance majors and a finance
minor for computer science majors, an MS in financial computing, and a certificate in Latin American
Studies. Other topics have married across disciplines without changing their names. Not counting learning
communities, between 2002–2003 and 2005–2006 interdisciplinary courses grew from 16 to 90.
Department-wide evaluations. In the departments where self-evaluations are most advanced, professors
have found themselves working harder to make sure the courses in a major relate to each other and
give students a comprehensive view of their field as well as the more familiar detailed introductions to
specific areas. Systematically comparing notes while preparing the self-assessment plan, computer science
professors discovered that different sections of basic courses were defining key terms differently, so upper-level instructors could not be sure what had been covered. The result was common definitions and more
uniform instruction.
The Biology Department Finds Good News about Analytical Thinking
Assessment has been particularly thorough for a new undergraduate biology curriculum developed by
Richard B. Schlesinger, the biologist who chairs the Department of Biology and Health Sciences, and his
colleagues. The new course of study increases emphasis on analytical and integrative skills. Previously,
majors were tested with traditional examinations that required memorization of facts. Now, exams in the
major biology core courses include more questions designed to encourage students to integrate concepts
across disciplines in biology, and between biology and related scientific fields.
As a check on how well such thinking is taking root, at the start of the junior year, students who finished
two years of biology core courses in the new curriculum were challenged with a standardized assessment
examination (the Educational Testing Service’s Major Field Test) that usually is used as an exit exam for
seniors who have finished the biology major. The test was given in the fall so the faculty could pinpoint
and address deficiencies in the students’ background knowledge as they began the specific advanced tracks
in the curriculum.
The results were encouraging. While confirming that some additional courses may be needed in areas
where students tended to be weak (the courses will be added over several semesters) and suggesting
improvements in advising, the results also showed that with three semesters still ahead, almost all the
students already were close to the national norm for seniors near graduation. Students in the new biology
curriculum performed better than those who were “grandfathered” to use the old curriculum.
Perhaps most encouraging, the examination clearly showed that in “analytical skills,” beginning juniors in
the new curriculum outperformed seniors who were finishing the old one.
Intensified writing. The cross-disciplinary essential of good writing is now assessed more consistently and
taught more intensively than it has been in the past. Starting in the summer of 2006, all incoming students
take a writing placement test asking them to read a short paragraph and write a brief essay evaluating it.
Students are assigned to one of three introductory writing courses based on criteria the English faculty
has agreed on (see box, below). As the term goes on students are required to keep their written work,
including marked papers, in a portfolio. At the end of the term, they submit the portfolio with a “reflective
statement” on their growth as a writer and two of their essays, one of which must cite research. The
portfolio is designed to let both students and professors concretely see progress—and since, famously,
“writing is rewriting,” papers in the portfolio can get high marks for showing good revisions.
In other subjects, growing numbers of courses are now “writing enhanced,” with required essays. Here too,
Pace faculty members are beginning to agree on common criteria, sometimes called rubrics, for assessing
how students are doing and what kinds of additional writing instruction they may need.
Better writing also has been an unplanned consequence of courses in which professors and students
communicate online. Richard Velayo, a Pace psychologist, has published research showing that students
who communicated outside of class via a keyboard were significantly more likely than students who
attended class in person to organize their writing well and use good grammar.
Dickens? Wolfe? No, But Not a Bad Start for a Business Memo
Greatness in literature always will be judged subjectively, but the basics of good practical writing can be
specified. Here is what students are told about the writing placement exam that Pace is requiring of all
entering students starting in 2006.
Faculty will evaluate your exam for evidence of the following criteria:
• A clear thesis (statement of your position or opinion) responding to one of two short statements or questions
• Individual paragraphs that are organized around a central idea, in support of your thesis
• Good use of evidence to develop individual paragraphs
• Logical placement of all paragraphs
• Sentences that flow smoothly and demonstrate a variety of patterns
• Adherence to Standard American English use of words, punctuation, sentence structure, and grammar.
(Adapted from the University of Baltimore Academic Resource Center grading criteria archive for writing assignments.)
Life planning. In recent years, the University has found itself losing an unacceptable number of students
between sophomore and junior years. Research studies elsewhere showed that sophomores can get
lost and become disengaged from college if they do not see a purpose or practical goal for themselves.
Combined with the NSSE data on students talking about career plans with faculty members or advisers,
this information led to a new course, being pilot tested in 2005 – 2006, to help students who are undecided
about their majors do personal assessments of their possible career interests while being introduced to the
careers to which majors lead.
Senior capstone courses. An expansion of the idea behind the traditional senior thesis, these are projects
designed to integrate most of the learning objectives in a major. The number of students completing
capstones has grown steadily, to well over 70 percent of the senior class, and the University hopes soon to
make them universal.
In the Lubin School of Business, for example, capstone seminars now require skills from management, marketing, accounting, and finance, often in a team project. While surveys have found that some faculty members' opinions of capstone projects are mixed because of the undeniable extra supervision required, professors also say capstone students understand connections among course materials better than students without capstone experiences.
A more concrete outcome is that several capstone projects have turned into Fulbright applications, helping
raise the number of Pace students who won the prestigious postgraduate fellowships for work and study
abroad to 16 since 2002. Fulbright fellows from Pace have gone to Nicaragua to investigate empowering
women through civic engagement, to the Czech Republic to look into ownership, control, and firm value
in emerging market privatizations, and to Canada to study the role of “RB” in cell differentiation.
V. Beyond Data
Pace University is far from fixated on hard data for self-assessment. Other self-assessment techniques follow.
Teaching portfolios
In 1997 a professor of management at Pace’s Lubin School of Business, Peter Seldin, published The
Teaching Portfolio, a book that is widely credited with accelerating the use of portfolio techniques for
faculty self-evaluation. In 1999, Pace’s newly-created Center for Faculty Improvement took an early lead in
offering the technique to Pace professors.
Faculty members now sign up to work with a faculty mentor, from a department other than the one where
the participant teaches, for about half their time during the first week of summer vacation. The mentor
encourages the professor to follow a protocol for setting forth what he or she wants to teach, how it is
structured, and how the results are measured, and for assembling a portfolio of concrete examples and
results. Because the mentors do not influence mentees’ salaries or promotions, the process encourages
candid reflection.
At the very least, participants say the resulting insights have reduced the number of mutually-frustrating
conversations like:
Professor: “The test will be heavily based on the textbook.”
Student: “But you taught us material that’s completely different and didn’t assign those chapters.”
Beyond that, Seldin's research shows that professors who put themselves through the process broaden their range of teaching techniques. Their students consistently report higher levels of satisfaction with them than with other professors, saying they are more engaged in teaching, more willing to experiment in class, and more interested in reflecting on what has gone well and what has not.
Participation is voluntary, and Pace professors who administer the program acknowledge that it may not always reach the faculty members who most need it. Still, deans and department chairs encourage attendance by faculty members who are up for tenure or having teaching problems. Professors know that the provost's published descriptions of each candidate's dossier specifically mention a teaching portfolio if one has been submitted.
Since 2001, about 300 Pace University faculty members have participated in 15 portfolio workshops of 20 people each, and the process has acquired "a certain cachet," according to Linda Anstendig and Constance Knapp, respectively former and current codirectors of the faculty development center. In 1990, Pace and approximately 10 other campuses were using the technique; by the middle of 2006, Seldin reported it in use at more than 2,000 institutions in the United States and many abroad.
Classroom assessments
At the other end of the complexity scale, "the muddiest point" and "the one-minute paper" are among the simple techniques that have been spreading at Pace to help professors gauge how they are doing week by week. At the end of a day's class, professors ask students to describe the "muddiest" point, the one they had the hardest time "getting," or to take no more than a minute to summarize the most important lesson they are taking away. Such techniques are summarized in a 1993 book, Classroom Assessment Techniques, by K.
Patricia Cross and Thomas Angelo.
A Pace philosophy professor, Harold Brown, invites logic and ethics students who want extra credit to work through the principles he is teaching by applying them to everyday occurrences and practices. They
illustrate their understanding by finding examples of arguments in editorials and comic strips, and Brown
modifies his subsequent teaching according to what they turn in.
Focus groups
In the fall of 2005, faculty and staff members sat down with students to discuss a perception that students were confused about the 12 objectives of the University's required core curriculum (including skills such as effective citizenship; global, national, and international perspectives; valuing; aesthetic response; and information literacy and research). The students said they did not know the objectives and would like to; some said they wanted to know which objectives were being sought in each of their classes. As a result, improved methods for communicating the objectives are now being planned for the next semester.
Similar sessions, including focus groups run by The Pace Poll, the University's survey research organization, have proved useful in exploring issues related to the campus climate for learning stressed by the NSSE, as well as more practical issues like retention.
Benchmark comparisons: Eye-openers for Eeyores
Before the start of Pace’s strategic planning process five years ago, University teams identified 12
universities to use as benchmarks for Pace’s progress (see list in appendix). Since then, formal and informal
comparisons have been made in numerous areas. For instance, Patrick Love, an associate provost assigned
to improve counseling and advising for students, took on the University’s inevitable nay-sayers with a
list of 20 features of Pace that he thought people could be proud of, ranging from learning communities
to the honors college, Model UN teams, a common hour for student activities, and online chats with
the president. He then hired two graduate students part-time to research how many of a set of comparison institutions, some from the benchmark list and others chosen as "competitors," offered similar features.
The result was an eye-opener for those with the cast of mind of Eeyore, the gloomy donkey in A.A. Milne’s
Winnie-the-Pooh: Love found that only three of the benchmark institutions seemed to offer even half of
the programs Pace provides. He since has presented his findings to multiple campus groups, using the
comparison to argue that the further changes he believes are needed have to start with the evidence that
“Pace is doing a lot of good work” and has a strong “ethos of contributing to the success of our students.”
Outside reviewers
Since 2001, Pace has brought in professors from other institutions to evaluate its academic departments.
Two outside reviewers spend at least one day in each department every five years, a cycle that annually
reaches 20 percent of the departments. The cost per department is approximately $3,000. Though some
reviewers have been criticized for being “too soft” and overly sympathetic to departmental thinking, at
their best they break the inevitable provincialism that sets in at departments everywhere, not just on
campuses.
VI. Traditional Inputs and Outcomes
For all its measures of “outputs,” Pace still measures a number of inputs that traditionally are related to
learning, assembling them in close juxtaposition in several report formats. (A partial listing of its financial
metrics is in the appendix.) Recent reports have shown the following.
Students and professors. Average SAT scores rose from 1055 in 2002–2003 to 1081 in 2005–2006; students in the honors college increased from 115 to 198; Pace became more selective as the percentage of applicants accepted declined from 76 in 2001 to 70 in 2004; the number of full-time professors teaching first-year students increased; and the University met or exceeded the strategic plan's goal of four new faculty positions per year.
International Programs. The number of Pace students studying abroad tripled between 2001–2002 and 2004–2005, though so far they comprise only four percent of the student body. In a report to the assessment office, the School of Law noted that it established a student exchange program with two universities in Latin America's largest country, Brazil; is cosponsoring a three-day conference of international lawyers and law professors discussing the enforcement of international environmental laws (fall 2006); led the expansion of the world's largest moot court competition (in Vienna) to include a three-day Hong Kong "Moot East"; and expanded its International Commercial Law Database.
Research centers and laboratories that intensified their international focus in the last four years included
those in nursing research, clinical practice and international affairs, formal computing methods,
information technology intelligent agents, pervasive computing, distance education, telecommunications
and learning, international business development, global finance, financial reporting in global markets,
securities markets, and autism.
One-stop student services. Several NSSE findings confirmed widespread student dissatisfaction with Pace’s
registrar and financial aid offices. This helped catalyze long-simmering ideas for merging the offices and data systems and cross-training staff members so they can be more responsive.
Predictable costs. In 2003, Pace University pioneered two programs that help parents and students plan
their college finances. Pace guarantees that the tuition a student pays the first year will not change for up
to five years of attendance in good standing. The Pace Promise guarantees to make classes available so
students who pick a major in their sophomore year can graduate in four years. Three Pace undergraduate classes are now paying guaranteed, level tuitions, and the number of students who pick majors promptly has grown.
These cost-planning programs seemed to help increase retention, holding down recruiting costs and thus
helping hold down tuition increases. Preliminary estimates are that since 2002– 2003, first-year retention
has gone up 2 percent and the six-year graduation rate has increased 6 percent.
Drug, alcohol, and healthy lifestyle programs multiplied during the last three years from 44 to 97, and in the same period reported alcohol and drug violations dropped from 36 to 18. In one year, attendance at fire safety programs rose from 282 to 368.
Athletics. Assessment data in the 1990s eventually led to construction of a recreation and fitness center
with a pool. This in turn has led not only to an intercollegiate swim team, but to increased interactions among students, who now have an additional place to meet.
Validation by outside groups. In recent years the University gained national professional accreditation for
the first time for its education school (from the National Council for Accreditation of Teacher Education),
and for its accounting program (from AACSB International). It earned renewal of accreditation, which
requires self-study and a several-day visit by a team of examiners from other campuses, for the business
school (from AACSB International), the psychology department (from the American Psychological
Association), the Physician Assistant program (from the Accreditation Review Commission on Education
for the Physician Assistant, Inc.), and for the nursing school (from the American Association of Colleges
of Nursing’s Commission on Collegiate Nursing Education). The U.S. Department of Defense declared
the Seidenberg School of Computer Science and Information Systems a National Center of Academic
Excellence in information security.
Fellowships. In addition to its record number of Fulbright fellowship winners, 12 students won Jeannette K. Watson Fellowships for three summers of internships and mentoring in government service, nonprofit organizations, or private enterprise; 10 won Rockefeller Brothers Fund Fellowships for Students
of Color Entering the Teaching Profession; four students earned summer research fellowships in science
from sources like the National Science Foundation; and one student, who earlier won a Fulbright to probe
how German land use laws might be applied to help reduce sprawl in the United States, won a Robert
Bosch Foundation Fellowship for an executive internship in Germany.
Professional licensing exams showed improved pass rates: the nursing pass rate rose from 85 percent in 2002–2003 to 95 percent in 2005–2006, and graduates of the Physician Assistant program increased their rate from 88 to 100 percent.
Jobs. Every spring the University's office of Co-op and Career Services, the largest optional program of its kind in the New York metropolitan area, queries employers to find out how the Pace students who worked for them did. The results are not statistically tabulated, but are actively relayed to academic departments,
which frequently use them to revise their curriculums as Pace’s school of computing did when use of Excel
spreadsheets grew faster than it anticipated. (The office also queries students on how much employers let
them learn on the job, and confidentially uses the information to advise future students.)
VII. Next Steps
Pace subscribes fully to the idea that self-assessment and self-improvement should be continuous. Pace’s
work to date has prompted ideas for a number of next steps. Many will be incorporated into two major
University-wide efforts getting underway in the 2006–2007 academic year: developing a successor to the
current five-year strategic plan, and preparing the extensive information required for the next review of
the University in fall 2008 by its regional accrediting agency, the Middle States Commission on Higher
Education.* Plans and ideas for the future include the following.
Support grassroots assessments
• As the culture of assessment spreads, a number of departments have increased or started their own assessment projects. In the Department of Computer Science, for example, faculty members became dissatisfied with existing "field tests" of knowledge and skills and started pilot testing their own in the spring of 2006. The assessment office has acquired software for creating survey questionnaires and reporting results, making it available to departments and programs that ask to incorporate surveys in their assessment plans.
Refine presentation formats
• Starting in 1997–1998, the University experimented with ways of compiling and sifting the large amounts of data it was gathering.
The effort began with a "scorecard," an attempt at setting up performance indicators to track Pace's progress on the flawed but influential measures used by U.S.News & World Report in its annual college rankings. Seven years later, the "scorecard" is still presented to the board each year.
In addition, a much broader overall assessment format has been developed. Informally termed the "assessment matrix" and prepared by the office of assessment, it is a spreadsheet compiling progress on all the goals of the Pace strategic plan.
Analyze and use the data
• Increase responses from units with self-assessments about their results and how they use or plan to use them for improvement.
• Expand feedback loops that let the findings of academic self-assessments be accessed and used by the relevant departments and schools.
• Continue to share University self-assessments through the faculty and staff councils, including the Faculty Institute and other faculty development programs.
*The Middle States Commission on Higher Education can be reached at 3624 Market Street, Philadelphia, PA 19104-2680,
telephone (267) 284-5000.
• Identify ways to share specialized self-assessments, like those for the library and information technology, with the people providing those services and other interested community members.
• Incorporate more self-assessment results into the budgeting process.
Increase participation
• Follow up with departments that do not yet have self-assessment plans to help them with plan development.
• Identify ways to more fully involve students in the self-assessment process.
• Identify obstacles to self-assessment and to using the results.
• Increase the number of people involved in evaluating the data and making recommendations based on it.
• Foster receptivity to sharing "best practices."
Build awareness
• Continue educating faculty members on the need for and value of self-assessment in the teaching and learning process.
• Keep the University community informed about the accountability issue in higher education and the need to demonstrate accountability through self-assessment.
• Keep educating faculty members on the great variety of self-assessment methods, and emphasize that faculty members are not locked into one test or method.
Refine objectives
• Start formal faculty discussions on what skills, knowledge, and abilities should be assessed at an institutional level and what tools might be used: What characterizes a Pace graduate? Begin compiling a list of expected student learning outcomes for each program (these will be required by the Middle States accreditation review), and identify assessment practices for each program.
Support change financially
• Identify more ways to support changes suggested by self-assessment findings, perhaps by expanding the scope of presidential grants.
Gauge value added
• An ultimate test of the new self-assessment systems at Pace will be developing the ability to assess value added. Simply administering instruments like the CLA longitudinally, assessing students when they arrive and again when they leave, will be a start. The overall effort is getting underway. David Caputo, the president, has said: "We know that institutions that start with the better students will likely end up with very high test results. But what is more important is learning what we add to a student's education."
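In practice, value added is often estimated with a regression adjustment. The sketch below assumes a simple linear model relating entering and exiting scores; it is illustrative only, not the CLA's published methodology or a formula Pace has adopted:

\[
\hat{y}_i = \alpha + \beta x_i, \qquad \widehat{\mathrm{VA}}_i = y_i - \hat{y}_i
\]

Here $x_i$ is student $i$'s score on arrival, $y_i$ the score at graduation, and the residual $\widehat{\mathrm{VA}}_i$ the gain beyond what entering ability alone would predict. Averaging the residuals across a cohort gives an institution-level estimate, which is what lets a campus that starts with weaker students still demonstrate strong results.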
VIII. Lessons for the Future
Selecting from national self-assessment resources and drawing on its own, Pace University has tried to
avoid many evaluation pitfalls. As Peter Ewell of the National Center for Higher Education Management
Systems summarizes the most notable ones, the dangers are that “standardized examinations founded
on well-established psychometric principles, designed to produce summary performance statistics
quickly and efficiently,” nonetheless can be “excessively mechanistic,” encourage “teaching to the test,”
and reduce students to “products” and learning to a “merely additive” process. He says such testing,
sometimes reflecting “no child left behind” models, can ignore a growing consensus that “paths of student
development should not be seen as linear and additive but rather as organic and transformational.” How
do you measure appreciation of a poem or creative flashes of entrepreneurship?
Other observers on the national scene have worried that standardized, nationwide college assessment
could siphon money from poor colleges to rich ones by rewarding the good results that prestige
institutions get instead of bolstering institutions lower on the pecking order that could be most helped by
more resources. In public schools, evidence is mounting that the high-stakes testing movement has driven
people away from low-performing schools to alternatives, making the low-performing schools worse. Such
an outcome could, consciously or not, perpetuate the status of “advantaged” groups.
These concerns, and others, including worries that testing will put money into the testing industry that could be better spent on instruction and research, deserve continuing scrutiny.
Pace University estimates that it spends about $250,000 a year on self-assessment, for instruments developed and scored by outside organizations, external reviewers for academic departments, and faculty and staff time. In the University's total budget of $270 million, that is less than one-tenth of one percent, roughly the cost of 250 desktop computers.
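For scale, the arithmetic (the per-computer comparison assumes roughly $1,000 per machine, an assumption of this example rather than a figure from the report):

\[
\frac{\$250{,}000}{\$270{,}000{,}000} \approx 0.09\%
\]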
The game seems well worth the candle. As the Middle States Commission on Higher Education recently
noted, “The cost of not engaging in assessment may be even higher than the cost of doing so….
Institutions that do not … use outcomes information are less able to examine student development and
are not able to provide an adequate rationale for modifying existing programs or curricula….”
So far at Pace, self-assessment has been a lever for encouraging more active and coherent learning,
improved scores on professional exams, a rising level of overall quality, and increased striving for
continuous improvement.
Pace administrators stress that adapting self-assessment to the particular vision of their own institution
has been particularly valuable. As Pennipede puts it, “Not all campuses value service learning, community
connections and civic engagement as much as Pace does. Pace and all institutions have special qualities
that standardized measures cannot deal with—but that the universities and colleges can measure for
themselves.”
Provost Morreale says, “The key ingredients in the success of a self-assessment program are connecting
assessment strategies to the mission and goals of the university, and departmental and school control of
tools and techniques to measure outcomes.”
Summing up, Caputo declares, “While there is no substitute for faculty members with rigorous minds and
empathic hearts, student assessments and evaluations provide the most important information for those
who want to be sure their students are learning. If they are not done comprehensively and systematically,
the quality of education has to suffer. Without measurable outcomes and information, no matter how
gifted the professors, teaching evaluation and effectiveness will remain an art when it needs to be blended
with well crafted science. I expect the Pace model to move us toward the science.”
A good chance exists that higher education can manage self-assessment and accountability better than
elementary and secondary schools have, balancing standardized tests with more subtle and humanistic
techniques. In the ongoing discussion, Pace will be happy to share its hard-won knowledge.
Appendix: Other Self-Assessment Techniques at Pace University
Benchmarks
At the start of work on its current strategic plan, the University's planning committee selected a set of 12 benchmark institutions, still in use, that Pace sees as peers or "targets of aspiration." In the New York metropolitan area they are Fordham University, Hofstra University, New York University, Seton Hall University, and St. John's University; across the country, American University, George Washington University, Northeastern University, Temple University, the University of Miami (Florida), the University of Pittsburgh, and the University of Southern California.
Survey of student attitudes, expectations, and behaviors
• The CIRP (Cooperative Institutional Research Program) Incoming Freshman Survey. This survey was developed in the 1970s by the Higher Education Research Institute at UCLA. It provides baseline data for incoming students and allows for trend analyses. At Pace it has provided information to help the admissions department assess its marketing and recruiting, and to help administrators of first-year programs know what kinds of services incoming students expect. The College Student Survey is the CIRP counterpart for seniors.
Surveys of factors influencing retention
• Your First College Year. A companion to the CIRP Incoming Freshman Survey but given at the end of that year, this has been used by Pace to find out what students in the honors program thought of their college experience. An important finding was that 93 percent of respondents planned to attend Pace in the subsequent fall semester, which provided reinforcement for decisions to continue and expand the program.
Surveys of fundamental educational processes
• Higher Education Research Institute Faculty Survey. Provides information about faculty attitudes, experiences, concerns, job satisfaction, workload, teaching practices, and professional activities. At the request of institutions belonging to Campus Compact, including Pace, the 2004–2005 survey put special emphasis on civic engagement.
• Educause Center for Applied Research Technology Survey. Just starting to be used at Pace with first-year students and seniors, this study's goal is to gain an understanding of how students use information technologies and to compare Pace students with those at similar institutions. The survey is expected to gauge students' technical skills, the extent to which they use instructional technologies, their ability to use information technology to solve problems, and the degree to which they are technologically prepared for graduate school or the workplace. Pace University also uses its own surveys to measure the faculty's use of information technology.
• LibQUAL+ Library Service Quality Survey. As a result of the 2003 assessment and other user feedback tools, the library on Pace's downtown campus was rearranged to separate quiet areas from areas for group study, computers were upgraded, and computer authentication systems and the Web site were redesigned to overcome student difficulties when working from a residence hall or home.
• Senior Outcomes Survey. A survey developed by Pace's office of Co-op and Career Services, this is mailed to seniors just after graduation and asks whether they have been accepted by graduate schools or found jobs.
Pace also uses surveys of its own design to measure the outcomes of learning communities and courses incorporating “service
learning.”
Note: Because it is not being tested in New York State, Pace is not participating in pilot tests of the new instrument developed by
the National Center for Public Policy and Higher Education that is being considered by the National Commission on the Future
of Higher Education. Because of Pace’s previous commitment to pilot testing the Collegiate Learning Assessment, the University
had to decline to use a test of writing and critical thinking recently developed by the Educational Testing Service.
Locally-developed departmental tests
Many departments at Pace develop their own instruments, often with help from Pennipede. Dissatisfied with existing “summative”
tests of knowledge in computer science, faculty members in that field at Pace’s Seidenberg School of Computer Science and
Information Systems recently designed their own. Covering subjects from Coding Constructs and Operating Systems to Data
Structures and Complexity Analysis, a trial version was administered in the spring of 2006. It was to be evaluated at the school’s
first Assessment Day, a 9:00 a.m. to 4:00 p.m. event designed to help faculty members evaluate all assessments completed during
the year and recommend improvements.
Admissions/enrollment/marketing surveys
Like admissions departments everywhere, Pace uses various measures to track the kinds of applicants it attracts and those it loses to competitors. Marketing surveys are used less regularly, but look into perceptions of the University's value among potential students and donors.
Centennial assessments
Though the Centennial was barely underway as of spring 2006, activities are being monitored for numbers of participants, breadth
of constituencies being reached, and media coverage.
Financial measures
Financial indicators ultimately are ways of tracking means to the end of learning. Those interested in more detail are invited to
write the Executive Vice President for Finance and Administration. Measures currently in use at Pace are:
Expendable Net Assets — unrestricted and temporarily restricted net assets minus fixed assets, less long-term debt
Liquidity Ratio — investments (excluding endowment funds) / long-term debt outstanding
Adjusted Viability Ratio — expendable net assets, adjusted for the postretirement benefit accrual / long-term debt
Adjusted Primary Reserve Ratio — expendable net assets / total expenses
Debt Service Ratio — debt service / total revenues
Maximum Annual Debt Service Ratio — highest level of annual debt service payable during the life of the bonds / current-year total unrestricted revenues
Total Debt/Total Capitalization Ratio — total short- and long-term debt / (total debt plus total net assets)
Return on Net Assets Ratio — net growth in net assets / net assets at beginning of year
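As an illustration, here is one of these ratios worked through with hypothetical figures; the dollar amounts are invented for the example and are not Pace's actual financials:

\[
\text{Adjusted Primary Reserve Ratio} = \frac{\text{expendable net assets}}{\text{total expenses}} = \frac{\$54 \text{ million}}{\$270 \text{ million}} = 0.20
\]

A ratio of 0.20 would mean the institution could cover roughly two and a half months of its expenses from expendable reserves, one common reading of a primary reserve ratio.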
Other Sources of Information
The Pace University assessment Web site, under “A” at www.pace.edu, has extensive annotated bibliographies and
links to sources ranging from books to professional and accreditation organizations.
Pace University
One Pace Plaza
New York, NY 10036
www.Pace.edu