Annual Report 2012-2013

Assessment Steering Committee
ANNUAL REPORT ON ASSESSMENT AT PURDUE UNIVERSITY NORTH CENTRAL
19 April 2013
A system of continuous reflection and improvement is integrated into each of the twenty-two
degree programs at Purdue North Central. This responsibility lies with each of the sponsoring
departments. The Purdue University Senate By-Laws commissioned the Assessment Steering Committee
(ASC) to provide some monitoring and oversight, specifically, “to ensure the design, implementation and
application of the results of the various plans for assessing the effectiveness of student learning, General
Education, degree and certificate programs and instruction and administration.” (Article III F.2) This
report seeks to update the Senate on the state of assessment at Purdue North Central.
Monitoring Assessment on Campus
The 2009-2010 Purdue system-wide preparations for accreditation produced the Boilermaker
Accreditation and Learning Outcomes Tracking Site known commonly as BALOTS.
(https://outcomes.itap.purdue.edu/) Though intended as “an important tool for documenting
assessment efforts in a systematic, structured, and unified manner”
(http://www.purdue.edu/accreditation/2010/criterion/criterion3.php), BALOTS has fallen out of use.
The Director of Assessment for the Office of the Provost explained, “We quit using BALOTS after the
accreditation visit because everyone hated it.” Discussion with PNC program coordinators revealed
dissatisfaction with the tool, and a cursory review of some of the information entered into BALOTS
revealed a mechanical recording of information largely devoid of meaning (e.g., the BLS BALOTS
report). The ASC has, therefore, improvised with a Qualtrics survey.
Several attempts by the ASC to administer the Qualtrics survey via e-mail brought only scant
responses. The four Deans were contacted, and with the aid of the Associate Vice Chancellor for
Academic Affairs, 13 unique programs responded to the survey between May and November 2012. In
the College of Engineering, only CIT responded to the survey. The Dean believes, with good reason, that
the Senate has no real authority over the departments since it is not truly a representative body. Should
the Qualtrics survey be used again, two strategies should be adopted to increase the response rate: a
working session in a computer lab where program coordinators are invited to work with the ASC to
complete the survey, and an active partnership between the ASC and the Administration.
The format and mechanics of the survey facilitated some information gathering about
assessment across the responding programs but also presented some problems:
 Programs were identified based on their answer to the first question on the survey.
o If the first question was marked incorrectly or left blank, it was difficult to identify the
program.
 Users were not able to start the survey and save their work for later, leading in some
cases to duplicate responses.
 Sometimes it was difficult for the Committee to follow what the program coordinator was
referring to (e.g., they may refer to a capstone project or a survey without explaining the
project/survey).
 Two different views of the results proved necessary: one that aggregated the quantitative
results and one that showed the full set of responses for an individual responder.
Potential improvements readily come to mind. Future surveys that require a username should
permit the periodic saving of work in progress. Users should be made aware of the purpose of the
first question. Members of the ASC should have access to the survey so that the responses can be
checked periodically and reports can be printed in the desired format as needed without repeatedly
asking for help from Maria Watson, the survey administrator. Program coordinators should be directed
to describe the methods used, not just identify the objectives and discuss the results. Purdue North
Central may also consider replacing the Qualtrics survey with TracDat software, developed by
Appalachian State University and about to be purchased by West Lafayette.
The 2012 Qualtrics survey yielded results in two areas: Essential Learning Outcomes and
program-specific outcomes. The existence of outside accreditation is an important factor.
 Essential Learning Outcomes:
The campus as a whole has seriously begun to assess the Essential Learning Outcomes adopted in
2010 as part of an effort to specify and articulate general education objectives. These standards,
created by the Association of American Colleges and Universities (AAC&U), are pursued in each degree
program both in the common General Education Core and in discipline-specific curricula.
Number of programs assessing each Essential Learning Outcome:
 Knowledge of Human Cultures and the Physical and Natural World (through study in the
sciences and mathematics, social sciences, humanities, histories, languages, and the arts): 6
 Intellectual and Practical Skills (including inquiry and analysis, critical and creative thinking,
written and oral communication, quantitative literacy, information literacy, teamwork and
problem solving): 10
 Personal and Social Responsibility (including civic knowledge and engagement, local and
global; intercultural knowledge and competence; ethical reasoning and action; foundations
and skills for lifelong learning): 6
 Integrative and Applied Learning (including synthesis and advanced accomplishment across
general and specialized studies): 9
Several programs have assessed multiple Essential Learning Outcomes:
 Behavioral Science
 Business
 Human Resources
 Liberal Studies
 Nursing
 Organizational Leadership
The remaining programs reported only one Essential Learning Outcome.
Reporting programs used a variety of tools to gather information including pre/post-tests,
capstone projects (utilized by several programs), surveys, mapping objectives, and assessment of a
redesigned course. Twelve programs rated their assessment of the Essential Learning Outcomes as
somewhat satisfied, satisfied or very satisfied. One program claimed to be very dissatisfied but this may
have been a misstatement. The rationale for the rating provided by the program that was very
dissatisfied was: “because we have worked very hard as a department to ensure that our assessment
measures are in alignment with our accrediting body and effectively measure the standards we have
established for our programs and those established by NCATE.” This response does not sound negative,
and therefore may indicate that the respondent intended to choose very satisfied rather than very
dissatisfied.
The rationales reported by the programs that were satisfied fell into two categories, which can
be paraphrased as: “We are pleased with our start at assessment, but more work is needed,” and “We
have worked very hard on assessment and are pleased with the results of our efforts.”
Reflection: We were pleased that all of the programs that responded to the survey assessed at
least one of the ELOs. In addition, the fact that each of the ELOs was assessed by at least six programs
shows that the campus as a whole is assessing the ELOs. Like the respondents, we too are pleased with
the work that has been done so far to assess ELOs on the PNC campus. We will strive to help programs
continue their assessment. The results, however, are based on self-reporting; the ASC did not actually
examine the raw information.
 Program specific outcomes:
Twelve of the thirteen reporting programs identified at least one program-specific outcome that
was assessed in the last year. Several responded that they had assessed “all of them”. Several programs
mentioned outside accreditation (recently recognized or pending). Several programs reported that they
now “know where we stand” or have established a baseline.
In self-rating of their program-specific assessment, 11 programs chose somewhat satisfied,
satisfied, or very satisfied. One program chose very dissatisfied and 1 program did not submit a
response. The program that chose very dissatisfied was the same one that did so for the ELOs and they
provided the same rationale: “because we have worked very hard as a department to ensure that our
assessment measures are in alignment with our accrediting body and effectively measure the standards
we have established for our programs and those established by NCATE.” This provides further evidence
that they accidentally chose the wrong end of the scale because it is clear from their responses that they
are satisfied with their program-specific review.
Of the 12 programs that answered the satisfaction question, 11 provided the exact same rating
as they did for their assessment of the ELOs. (One program rated their assessment of ELOs somewhat
satisfied and their assessment of program-specific outcomes satisfied.)
Reflection: Approximately half of the programs that responded to the survey report to an
outside agency (7 of 13). With one exception, programs that have (or are seeking) outside accreditation
report that they have assessed several outcomes whereas programs without outside accreditation tend
to focus on a specific outcome. The ASC may want to consider the pros and cons of depth vs. breadth
and/or raise awareness on campus about the differences between different programs. Do the
independent programs know what the accredited ones are doing and vice versa? Which method, if
either, is more productive? Is it better to assess multiple outcomes every year or to tackle them one at
a time? Does the appropriate method vary by program?
Assessment Fest V
Purdue North Central's fifth annual Assessment Fest took place on January 10, 2013. Since its inception
in 2009 as a way to raise awareness about assessment on campus, the format of the Assessment Fest
has changed over the past five years to meet the evolving needs of the faculty as they mature in their
efforts to review and improve their programs. The annual Assessment Fests have effectively fostered a
culture of assessment on the campus. Vice Chancellor Schmid calls it the most important event of the
Spring.
This year marked the first time that a panel discussion was implemented. Faculty from each of
PNC's Colleges representing a wide range of disciplines and assessment backgrounds served on the
panel:
College of Business – Cynthia Robert, Interim Dean
College of Engineering and Technology – Thomas Brady III, Dean
College of Liberal Arts – Mary Jane Eisenhauer, Assistant Professor of Education
College of Science – Jason Curtis, Associate Professor of Biology
After brief introductions focusing on their experiences and philosophies regarding assessment, the panel
accepted questions from the audience. The ASC was pleased with the result as many members of the
audience contributed questions, comments, and examples from their own experiences. The discussion
focused primarily on how to assess objectives that are not easily measured quantitatively such as
whether a student is prepared to act ethically in their field. Because of the faculty interest, this topic will
be incorporated into the ASC's activities for the coming year.
The social atmosphere of the Assessment Fest contributes to its success. Food contributed by
the members of the ASC, Associate Vice Chancellor Jayasuriya, and Vice Chancellor Schmid included
Sri Lankan curried chicken, a Polish stew called bigos, vegetarian chili, salad, and drinks. Music
accompanied lunch and ended when the panel discussion began. Response from the audience seemed
positive, though attendance peaked at 63, slightly down from the previous year.
Reflection: Audience participation was up due to the format. Should the panel format be used
again next year, the topic will focus on assessing the teaching of ethics and other broad topics. Such
qualitative, subjective, difficult-to-assess objectives will require more time. Assessment Fest VI should
last from noon (the exact start will depend on morning workshops) until 1:30 pm. Next year’s panel may
include not only representatives from the four Colleges but also a member of the General Education
Committee.
Program Review Board
Based on feedback received from the most recent HLC visit, PNC has established a program
review board which will periodically review each program on campus. The committee reads materials
provided by the program, attends a presentation, and writes a short report to provide feedback and
recommendations. In addition, it is hoped that the process will help to share good practices and
disseminate what is learned across campus. The program review board consists of faculty from each
college, a representative of the PNC advisory board, a representative from enrollment, and a student
representative, and is chaired by Dean Morrow. Although the program review board is not affiliated
with the ASC and serves a broader role, it seems that the program review process will augment the
ASC’s efforts to assess student learning. The current plan is for one or two programs to be reviewed
annually, with the process repeated on a five-year cycle. The ASC will monitor the assessment information
between reviews.
Purdue System Plan
A Purdue system plan is currently in development, and assessment will be a part of the plan.
Drafts of the plan, which will not be complete until May, have identified three goals pertaining to
assessment: (1) that all programs will have up-to-date assessment plans focused on student learning
outcomes, (2) that system-wide tools, training, etc. will be available to aid in assessment, and (3) the
development of an integrated system for assessment, program review, and accreditation. PNC is well
on its way in terms of the first goal. Cooperation with other campuses will be needed to achieve the
other two.
Mission and Challenges
The culture of assessment at Purdue North Central changed as it became a standard part of the
operations of academic programs. The initiative for assessment of student learning has thus shifted
from the ASC to the academic departments. This is a natural and logical development. Consequently, a
gap is developing between assessment practice and the functions of the ASC stipulated in the PNC
Faculty By-Laws. The framers of the By-Laws envisioned a powerful ASC with a scope of activities that
went far beyond “steering.” Article III.F.2 of the PNC Faculty By-Laws assigns the Assessment Steering
Committee a set of ambitious tasks:
 “monitoring, assessing and reporting to Faculty Senate at least quarterly the progress and
results of assessment activities at Purdue University North Central.”
 “interpreting the results of such assessment activities in relation to the stated goals and
outcomes of the relevant unit.”
 “interpreting the results of such assessment activities in relation to the stated goals and
outcomes … of Purdue University North Central as a whole.”
 “report this interpretation at least annually as directed by the Agenda Committee.”
 “The Assessment Steering Committee shall, in consultation with the various academic units and
groups and their assessment committees, annually review and recommend to Faculty Senate
procedural guidelines for the Campus Student Learning Assessment Plan.”
These are to include at least the following:
 “the assessment of student learning outcomes”
 “the assessment of the various degree programs and of the General Education program”
 “the assessment and evaluation of instruction and administration”
The ASC has not fully met this commission in living memory. The ASC has promoted a culture of
assessment through self-reporting by the degree programs themselves. A culture of assessment has
taken hold, though in the newly assessing degrees not answerable to outside agencies it may be
functioning haltingly and without enthusiasm. The ASC has continued a system of voluntary
self-reporting from degree programs. An annual Qualtrics survey has replaced BALOTS. But the ASC has
not examined on its own how well the degree and General Education programs have met their own
stated goals and those of PNC as a whole. There is no independent verification. The ASC has neither the
membership nor, in most cases, the expertise for the task. There is also concern that such supervision
would draw resentment as interference in the internal functioning of academic departments. The
Committee is handicapped by the voluntary character of the Faculty Senate, which leads to legitimate
concerns over its authority vis-à-vis the academic departments.
The scope and depth of the ASC needs to be reevaluated as it continues its mission of
monitoring assessment (Qualtrics survey) and fostering a culture of assessment (Assessment Fest).
Collaboration between the General Education Committee and the ASC needs to be established as the
general education objectives and requirements have been established and are starting to be assessed.
The current system of assessment where the responsibility rests solely with the individual programs
does not adequately prepare PNC to assess the general education core. Assessment should be split into
two components, campus-wide and program-specific. Campus-wide objectives for what a Purdue
graduate should be, know, and do need to be assessed by faculty from across the campus. Because the
faculty is charged with control over curriculum, review of that curriculum is the responsibility of the
faculty as a whole.
The ASC also needs to deepen its partnership with the Office of Academic Affairs. The Office is
responsible for the continuing accreditation of the campus which requires an effective system of
student learning. The ASC and the Office of Academic Affairs will draft a “Campus Student Learning
Assessment Plan”. A Task Force may be established to develop this plan should membership in the ASC
continue to be inadequate.
The state of assessment on campus is mixed. Programs with long traditions of vigorous review
as part of national accreditation have made assessment an integral, natural and everyday activity. For
these, the assessment loop is operational and effective. Programs that do not answer to outside
agencies have the more challenging task of working out their own assessment plans and maintaining
the constant effort to implement them fully. Experience indicates the need for disciplined, sustained
self-motivation and frequent monitoring by departments, by the Faculty Senate through the ASC, and
by the Office of Academic Affairs.
On a more general issue, the term assessment should fade from usage as its connotative
meaning has proven to be a handicap. Continuous reflection and improvement of courses and programs
is the most natural direction of devoted pedagogy. We teach, we reflect, we improve in a never-ending
cycle.
2012-2013 Assessment Steering Committee
Jessica Thomas, co-chair
Janusz Duzinkiewicz, co-chair
Debra Hollingworth Pratt, member
Kumara Jayasuriya, Vice Chancellor of Academic Affairs