Office of Educational Assessment

A Roadmap to Documents
Demonstrating University of Delaware Compliance With
Characteristics of Excellence
NOTE: This document will illustrate how the University of Delaware described
evidence of compliance with MSCHE Standards 2, 7, and 14. Many of the links in
this narrative are restricted access, and in order to see the specific evidence cited, you
will have to attend the MSCHE Workshop on “Compliance Evidence – What is the
Commission Looking For?” at the Annual Meeting in December 2011.
Standard One: Mission and Goals
The University of Delaware is a Land Grant, Sea Grant, Space Grant institution, and its mission
is, at least in part, defined by the Federal legislation associated with those designations. A
comprehensive statement of institutional mission was presented to MSCHE in the University’s 2001
decennial Self Study, and was the driving force in strategic planning activity from 1991 through 2001.
http://www.udel.edu/middlestates/report/fullreport.pdf (See pp. 21-22)
In 2007, Patrick T. Harker became the University of Delaware’s 26th President, and he
immediately re-engaged the institution in strategic planning activity, including an examination and
restatement of the University’s mission. The current refined mission statement was approved by the Board
of Trustees and incorporated into the University’s strategic plan, The Path to Prominence.
http://www.udel.edu/prominence/pdfs/Prominence_Plan.pdf (See p.2) The strategic planning activity
also entailed a series of revised goal statements associated with the University mission.
http://www.udel.edu/prominence/pdfs/Prominence_Plan.pdf (See pp.5-24) As noted, the University’s
Board of Trustees formally adopted the University mission statement at its May 2007 meeting.
http://www.udel.edu/PR/UDaily/2008/may/resolutions052108.html
Standard Two: Planning, Resource Allocation and Institutional Renewal
In the foregoing discussion of the University’s mission, it was noted that the mission statement
grew out of a comprehensive strategic planning process initiated in 2007 by President Harker. The
process was steered by a University Strategic Planning Committee, which delivered its report to the
President in April 2008. Of particular note is the fact that the Committee held over 100 meetings with
constituent groups on campus and from throughout the state and region in gathering input. Those
constituent groups are listed on pp. 20-21 of the Committee Report.
http://www.udel.edu/prominence/pdfs/SPC_Report_Apr_08.pdf The work of the Strategic Planning
Committee provided the framework for the development of the Path to Prominence strategic planning
document that the President presented at a University Forum in May 2008.
http://www.udel.edu/prominence/pdfs/Prominence_Plan.pdf. A detailed discussion of strategic
planning at the University of Delaware is found in Chapter One of the 2011 Reaccreditation Self Study,
which can be found at https://www.udel.edu/provost/selfstudy/
Concurrent with the development of the six Path to Prominence strategic initiatives, the
decision was made to move away from a centralized block budget resource allocation model to a
responsibility-based budget (RBB) model under which the University’s seven Colleges would be
incentivized to develop resources and to utilize them more effectively in support of those strategic
initiatives. The underlying budget philosophy of RBB and the steps in its implementation were
carefully explained through a series of presentations to academic and administrative personnel.
https://docs.nss.udel.edu/rbb-facstaff/resources/RBBUDOverview.pdf
Ongoing communication with campus constituencies with respect to all matters related to RBB is
achieved through a web site maintained by the Budget Office. http://www.udel.edu/rbb/
Standard Seven: Institutional Assessment
The University of Delaware has had a systematic approach to institutional assessment in place for
quite some time, measuring the extent to which the University is making effective and efficient use of
its human, fiscal, and physical resources in support of the institution’s mission. The overarching
framework for assessing institutional effectiveness is the University Mission Statement
http://www.udel.edu/IR/fnf/initiatives.html and the six strategic initiatives that have grown out of that
mission, which are captured in The Path to Prominence Strategic Plan
http://www.udel.edu/prominence/. These strategic initiatives are further delineated in Standards 1 and
2. At the most global level, the University utilizes an electronic dashboard,
http://www.udel.edu/IR/UDashboard_p2p/, to assess the progress of implementing each of the six Path
to Prominence strategic initiatives. The indicators within the dashboard provide a bird’s-eye view of
the extent to which the University is making progress towards reaching the goals of those strategic
initiatives that are identified as central to fulfilling the institution’s mission. This dashboard is routinely
accessed by the President’s Senior Staff and the University Board of Trustees to calibrate progress in
implementing The Path to Prominence, and to develop appropriate policy initiatives to ensure its
success.
A comprehensive approach to institutional assessment is evident in the University’s
Institutional Effectiveness Assessment Framework http://www.udel.edu/IR/IEMatrixRev0910.pdf
which is reviewed on a regular basis. The assessment of institutional effectiveness begins with students.
The University regularly employs the College Board’s Admitted Student Questionnaire (ASQ) to assess
its place within the admissions marketplace. ASQ is the basis for developing admissions marketing
strategies at the University. The most recent survey administrations took place in 2005
http://www.udel.edu/IR/msche/undergrad/data/ASQ2005.doc and 2007
http://www.udel.edu/IR/msche/undergrad/data/ASQ2007.ppt. The results from the 2007 administration
of ASQ have provided empirical evidence that the University is attracting a more academically prepared
applicant pool than in the past, as well as evidence that the University is losing applicants to a different
set of competitor institutions, notably highly selective private universities. These data, along with the
perception among non-enrollees that the University of Delaware’s academic environment is less
rigorous than elsewhere, contributed, in part, to shaping The Path to Prominence strategic emphasis on
a more diverse and stimulating undergraduate experience, with special emphasis on greater student
engagement.
The University regularly administers the ACT Entering Student Needs Assessment Survey during
summer New Student Orientation. The survey asks entering students to identify where they anticipate
needing help in such areas as academics, socialization, aesthetics, and technology. The
most recent survey administrations of the ACT Entering Student Needs Assessment Survey took place
in 2004 http://www.udel.edu/IR/msche/undergrad/data/ACTNeedsFall2004.pdf and 2007
http://www.udel.edu/IR/msche/undergrad/data/ACTNeedsFall2007.pdf. These data are routinely
disseminated among academic and student support units for evaluation of the fit between student needs
and existing services. The Office of Institutional Research also regularly administers a satisfaction
survey to undergraduate students. The ACT Survey of Student Opinions was administered in 2002 and
2006 http://www.udel.edu/IR/msche/undergrad/data/ACT_SOS_2006.pdf to assess student use and
satisfaction with programs and services at the University, as well as with 43 different characteristics of
the campus environment. During the spring 2009 semester, the Office of Institutional Research
administered a locally-developed Undergraduate Student Satisfaction Survey to a sample of students in
an effort to learn more about student opinion on a wide range of University programs, services and
campus characteristics http://www.udel.edu/IR/msche/undergrad/data/UGSatisFINAL09.pdf. Student
satisfaction research was the basis in the early 1990s for positioning the University as an early national
model for “one-stop shopping” in our Student Services Building, and continues to be a central tool for
evaluating the extent to which students use, and are satisfied with, student services and student life. In
addition to an Undergraduate Student Satisfaction Survey, an Associate in Arts Student Satisfaction
Survey http://www.udel.edu/IR/msche/undergrad/data/AASatisSummaryReporFinal09.pdf and a
Graduate Student Survey
http://www.udel.edu/IR/msche/grad/data/GraduateStudentSurveyS09Complete.pdf were administered
in spring 2009. The Graduate Student Survey is particularly important because it underscores that the
needs and expectations of that population are quite different from those of undergraduates. With The
Path to Prominence emphasis on expanding graduate education and research, these data will be
essential to attracting and retaining the caliber of graduate students who can, in fact, move the
institution forward.
While student satisfaction research provides a comprehensive overview of issues, occasionally a
more “drill down” approach is necessary. For example, the ACT Survey of Student Opinions identified
dissatisfaction with academic advising at the University. But what, specifically, are the issues related to
academic advising that engender dissatisfaction? To answer questions of this sort, the Office of
Institutional Research has instituted a series of “Campus Pulse Surveys.” These are short, web-based
surveys that focus on single issues and provide a more comprehensive portrait of student thinking on
various topics. Examples of Campus Pulse Surveys are those related to academic advising
http://www.udel.edu/IR/msche/undergrad/data/CampusPulseAdvising.pdf, issues of diversity
http://www.udel.edu/IR/msche/undergrad/data/CampusPulseDiversity.pdf, and issues related to
registration policies and procedures in a PeopleSoft environment
http://www.udel.edu/IR/msche/undergrad/data/CampusPulseRegistration.pdf.
Student engagement is a top institutional priority, and it is regularly assessed through the National
Survey of Student Engagement (NSSE). Administered to freshmen and seniors, it is cycled such that
freshmen who take it in a given administration are seniors in the subsequent administration.
http://www.udel.edu/IR/reports/nsse/index.html. NSSE will prove to be a central tool in assessing the
extent to which student engagement increases or is enhanced as part of The Path to Prominence
commitment to a more rigorous undergraduate academic experience. The University of Delaware also
monitors graduation and retention rates via annual participation in the Consortium for Student Retention
Data Exchange (CSRDE). Term-by-term retention and graduation rates for Newark Campus undergraduates
are submitted. Each year's report contains a University overall summary, as well as separate tables for
men, women, African-Americans, Whites, Hispanics, Asians, and Delaware residents and non-residents
with valuable benchmark information http://www.udel.edu/IR/msche/undergrad/data/CSRDE08-09_0107CohortsUDAnalysis.pdf.
Graduating students receive the Career Plans Survey that collects information on post-graduation
employment and graduate education. This information is then provided to the University’s Career
Services Center http://www.udel.edu/IR/reports/cplans.html. The Office of Institutional Research has
also administered a series of exit web surveys to graduating undergraduate and graduate students. The
exit survey is designed to capture students' level of satisfaction with the University of Delaware and
how the University has enhanced their life experiences. The most recent study is based on quantitative
data derived from three exit web surveys which were conducted in spring 2007, spring 2008, and spring
2009. The report summarizes the primary findings of the exit web surveys with particular attention to
undergraduate and graduate students as separate groups
http://www.udel.edu/IR/reports/exit/ReportonExitWebSurveys07_08_09.pdf.
In addition to the scheduled research surveys, studies, and reports http://www.udel.edu/IR/reports/,
the Office of Institutional Research receives over 150 ad hoc requests in a given year for data and
information from both internal and external constituencies to demonstrate institutional effectiveness.
Institutional effectiveness is dependent upon the extent to which the University understands its
position relative to appropriate comparator institutions on a broad range of key performance indicators.
Having selected a group of comparator institutions that encompasses both actual and aspirational peers,
the University’s Dashboard of Key Performance Indicators
https://www.udel.edu/IR/UDashboard_peers/ provides a clear roadmap with respect to areas in which
the institution needs to improve performance.
Institutional effectiveness is also dependent upon optimal performance within both academic and
administrative units. The University systematically and continuously assesses performance of units in
both the academic and administrative areas. Every academic program at the University of Delaware
undergoes an intense review every five to ten years. Academic Program Review (APR) includes the
development of a self-assessment by the unit, followed by an external evaluation by appropriate
consultants. The APR process is described at http://www.udel.edu/provost/academicprogram.html.
Representative examples of APR self studies, consultant reports, and institutional outcomes are found
in the document room. Academic Program Review is supplemented by Student Evaluation of
Instruction. At the University of Delaware, each College is responsible for developing its own
instrument for student evaluations of instruction. They are encouraged to draw items from a common
pool http://cte.udel.edu/instructional-topics/course-evaluation-item-pool.html developed by the Center
for Teaching and Learning.
In the late 1980s, the University developed a set of budget support metrics for use in assessing the
teaching productivity and instructional costs of academic units within the institution
https://www.udel.edu/IR/bsn/. These metrics provided the basis for resource allocation and reallocation
decisions between and among academic units during the recessionary period of the early 1990s. The
metrics were expanded in 1992 to allow for inter-institutional comparisons, and became known as the
Delaware Study of Instructional Costs and Productivity (Delaware Study), which now encompasses
nearly 600 four-year institutions http://www.udel.edu/IR/cost/. A secure University of Delaware
Academic Benchmarking web portal https://www.udel.edu/IR/reports/benchmarking/index.html has
been developed to display the results of the most recent data collection (2007-08 and 2008-09 data)
from the Delaware Study. Three prior years are also provided for trend analysis. The data tables display
UD academic departments' measures on six discrete variables with national benchmarks calculated
from data submitted by research universities from across the United States. These variables include: (1)
Undergraduate student credit hours taught per FTE tenured and tenure track faculty; (2) Total student
credit hours taught per FTE tenured and tenure track faculty; (3) Total class sections taught (excluding
labs) per FTE tenured and tenure track faculty; (4) Total student credit hours taught per FTE faculty (all
categories combined); (5) Direct instructional expenditures per student credit hour taught; and (6)
Separately budgeted research and service expenditures per FTE tenured and tenure track faculty. These
data have taken on a special importance over the past two years as the University has transitioned to a
responsibility-based budget (RBB) model in which academic units are financially incentivized to
increase activity in teaching, research, and service. The Academic Benchmarking Site provides a
contextual basis for Deans to challenge their units to be more entrepreneurial. Because the Delaware
Study treats externally funded activity only as contextual information when examining teaching
loads and instructional costs, the University began using Academic Analytics in 2009 to assess
scholarly productivity directly. The Faculty
Scholarly Productivity Index (FSPI) examines publications, citations, research proposal activity,
patents, honors, and awards for faculty in all Ph.D.-granting programs in the country. Information
related to Academic Analytics specific to the University of Delaware may be viewed at
https://www.udel.edu/IR/Academic_Analytics/. Should either of the Generalist Evaluators wish to see
the FSPI Benchmarking site, which is proprietary, access can be granted by the Associate Provost for
Institutional Effectiveness.
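The six Delaware Study variables listed above are simple ratios of teaching activity and cost to FTE faculty. A minimal sketch of how a department's measures might be computed is shown below; the field names and figures are invented for illustration and are not actual Delaware Study data or its real submission format.

```python
# Hypothetical sketch of Delaware Study-style ratios (variables 1-6 above).
# Department figures and dictionary keys are invented for illustration only.

def benchmark_ratios(dept):
    """Compute per-FTE teaching and cost ratios like the six Delaware Study variables."""
    tt_fte = dept["tenure_track_fte"]      # FTE tenured/tenure-track faculty
    all_fte = dept["total_faculty_fte"]    # FTE faculty, all categories combined
    total_sch = dept["total_sch"]          # total student credit hours (SCH) taught
    return {
        "ug_sch_per_tt_fte": dept["undergrad_sch"] / tt_fte,          # variable 1
        "total_sch_per_tt_fte": total_sch / tt_fte,                   # variable 2
        "sections_per_tt_fte": dept["sections_excl_labs"] / tt_fte,   # variable 3
        "total_sch_per_all_fte": total_sch / all_fte,                 # variable 4
        "direct_cost_per_sch": dept["instructional_expend"] / total_sch,   # variable 5
        "research_expend_per_tt_fte": dept["research_expend"] / tt_fte,    # variable 6
    }

# Invented example department
dept = {
    "tenure_track_fte": 20.0,
    "total_faculty_fte": 32.0,
    "undergrad_sch": 9000,
    "total_sch": 11000,
    "sections_excl_labs": 110,
    "instructional_expend": 2_750_000,
    "research_expend": 1_600_000,
}

ratios = benchmark_ratios(dept)
print(ratios["total_sch_per_tt_fte"])  # 550.0 SCH per tenure-track FTE
print(ratios["direct_cost_per_sch"])   # 250.0 dollars per student credit hour
```

In practice each departmental ratio would be compared against the national benchmark for its discipline, computed from the pooled submissions of participating research universities.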
The University of Delaware converted its administrative computing database from a mainframe
system (SIS) to PeopleSoft (UDSIS) and went “live” in 2006. PeopleSoft is a transactional database
and not intended for reporting. The delivered reporting tool from PeopleSoft would not allow reporting
across the Student, Finance, and Human Resources systems. For this reason, the University chose
Cognos as a reporting tool and the University of Delaware Enterprise Warehouse (UDEW) was
established. A resource website has been developed to support UDEW activities
http://www.udel.edu/IR/UDEW/. The website has information on how to gain access to Cognos, data
dictionaries, training documents, and RBB support. The goal of the UDEW group is to meet the data
needs of the University community by developing standard production reports such as the Grade
Distribution Report http://www.udel.edu/IR/reports/gradedist/index.html.
Ongoing assessment of institutional effectiveness is also accomplished through the University’s new
Performance Appraisal System which was introduced in 2008. The Performance Appraisal System
requires all employees, from the most senior to entry-level, to develop performance goals that are
specifically tied to the realization of the University’s strategic priorities as detailed in The Path to
Prominence. Academic and administrative units are accountable for the extent to which they contribute
to the implementation of the institution’s strategic initiatives, and compensation is tied to that
contribution. Conversations between employees and supervisors related to performance are not
confined to once a year. The Performance Appraisal System requires annual goal setting and
performance review every six months, at a minimum, and preferably on a quarterly basis. Each
performance appraisal completed by a supervisor receives a second review by the supervisor’s
respective Vice President to ensure compliance with University strategic objectives. The Performance
Appraisal Process is detailed at http://www.udel.edu/administration/appraisal.html.
The University of Delaware is committed to being a good employer and to helping individuals
achieve their fullest potential. In order to fulfill this goal, an Employee Satisfaction Survey has been
administered to better understand the level of job satisfaction for faculty and members of the
professional, salaried, and hourly staff. Employee Satisfaction Surveys were conducted in 1995 and
2006 http://www.udel.edu/IR/climate/EmploySatisSurvey06ppt.pdf. A Campus Climate Survey was
developed and administered in 2009 with assistance from the Diversity Action Council, the Office of
Equity and Inclusion, and the Office for Institutional Research, as well as input from a variety of
campus groups. The overall purpose of the survey was to determine how faculty, staff, and students
perceive the campus working and learning environment. The survey was specifically designed to
analyze how welcoming and equitable the campus is perceived to be, to assess behaviors respondents have
experienced, and to solicit suggestions for improving the campus climate [Faculty and Staff results:
http://www.udel.edu/IR/climate/ExecSummary_Employee_ClimateSurvey.pdf; Student results
http://www.udel.edu/IR/climate/ExecSummary_Student_ClimateSurvey.pdf].
Standard 12: General Education at the University of Delaware
In 2003, the University of Delaware’s Faculty Senate adopted a General Education Program that
requires graduates of the institution to demonstrate mastery of specific skills and competencies at the
time that they receive their degrees. These general education goals are designed to prepare students for
life in the technologically sophisticated, diverse, highly communicative and globally integrated world
in which they will live and work; and to offer students the opportunity to expand their own horizons,
areas of interest and intellectual development. In consultation with the faculty, the Office of
Educational Assessment (OEA) has developed appropriate strategies for measuring mastery of these
general education competencies.
• Report of the Faculty Senate Committee on General Education
• Faculty Senate Resolution Adopting the Report of the Committee on General Education
• 10 Goals of Undergraduate Education: http://www.ugs.udel.edu/gened/
It should be noted that the 10 Goals of General Education are currently under review by the
Faculty Senate Committee on General Education, with an eye toward consolidating and reducing the
number of goals and focusing on more systematic measurement of competencies such as critical
thinking, quantitative reasoning, oral and written communication, information literacy, and global
awareness.
Currently, the University has five requirements of all undergraduates:
• English 110 (3 credits)
• First Year Experience (FYE) (x credits*)
• One multicultural course (3 credits)
• Discovery Learning Experience (DLE) (3 credits)
• Four courses creating breadth (12 credits)
*The number of credits will depend on the designated FYE.
Students, both through participating in these requirements as well as in their broader academic
coursework, gain the skills and knowledge that will enable them to achieve the 10 Goals of
Undergraduate Education.
Assessment of General Education
At the most global level, the University has initiated a dual approach to assessing general
education, using a sample of 200 freshmen and 200 seniors. Because the University is a participant in
the Voluntary System of Accountability, a consortium of member institutions from the Association of
Public Land Grant Universities (APLU), the American Association of State Colleges and Universities
(AASCU), and the Association of American Colleges and Universities (AAC&U), we are required to
administer one of three prescribed standardized tests that purport to measure critical thinking
and communication competencies. To that end, we are administering the Educational Testing Service’s
Educational Proficiency Profile (EPP) to the aforementioned samples of students. However, because
there is considerably less than uniform support and enthusiasm at the University for standardized tests
that claim to assess general education competencies, we have asked the students within the samples
drawn for EPP to voluntarily submit artifacts that they believe represent their best work. These
artifacts will then be evaluated using the AAC&U VALUE rubrics for critical thinking and written
communication. Those rubric-based evaluations will then be examined against the EPP test scores for
those students to assess the commonality of findings. To the extent that these analyses are complete at
the time of the Document Review, the Generalist Evaluators will be directed to the appropriate site.
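One plausible way to examine the "commonality of findings" between the rubric-based evaluations and the EPP scores is a simple correlation between the two measures for the sampled students. The sketch below uses invented scores and a Pearson correlation; the document does not specify the actual analysis method UD will use.

```python
# Hypothetical sketch: comparing VALUE-rubric ratings against standardized
# test scores via Pearson correlation. All data are invented for illustration.
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented sample: rubric ratings (1-4 scale) and EPP-style scaled scores
rubric_scores = [2.0, 2.5, 3.0, 3.5, 4.0, 1.5, 3.0, 2.5]
epp_scores    = [430, 440, 455, 470, 480, 420, 460, 445]

r = pearson(rubric_scores, epp_scores)
print(f"r = {r:.2f}")  # values near 1.0 suggest the two measures agree
```

A high correlation would suggest the standardized test and the rubric-based review are capturing similar competencies; a low one would support the campus skepticism about standardized measures noted above.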
English 110
English 110, also known as First Year Writing (FYW), actively engages in assessment of
student learning. Most recently, the Director of Writing conducted an assessment of students’ expertise
in selecting and using sources in argument-based writing in both FYW and in the English Department’s
second-writing courses (English 301 and 302). While student achievement in ENGL110 aligns well
with the UD goals and expectations1, the data from the second-writing courses suggest students may
not retain this level of achievement. Further research into this apparent outcome is under way.
Complementing this assessment, an intensive study comparing First Year Writing (FYW) to the
second-writing courses within the Writing Program suggested that the stated goals of ENGL110 align
closely with students’ impressions of the goals of their individual course sections. In addition, the
1. The Writing Program mission statement is available at http://tiny.cc/ez36l. The mission statement and
learning outcomes for 110 are available at http://tiny.cc/ez36l.
program contributed to the university-wide assessment initiative by using externally-constructed rubrics
created by The Association of American Colleges and Universities (AAC&U) that relate to “Critical
Thinking” and “Inquiry and Analysis.” Given that these externally-developed rubrics are not directly
connected to the goals of the Writing Program, the data obtained was of limited use in program
development. Nevertheless, using this assessment to compare ENGL110 essays written on campus to
those written by high school students enrolled in UD’s dual-enrollment program did document success
in the latter program.
Even more useful than the data drawn from these rubrics was the information yielded
by an assessment of FYW based on AAC&U’s oral communication rubric. This assessment
revealed that a larger-than-anticipated number of writing program instructors are offering oral
communication instruction in FYW. In response to these data, the writing program is increasing its
opportunities for professional development in this area by offering both required all-program
development sessions and optional workshops.
In addition to assessing FYW, the Writing Program has also assessed the outcomes of the
University Writing Center, which provides support for writing to students from across campus and
throughout the curriculum. For example, the Center has instituted an anonymous evaluation form that
users complete at the end of a tutoring session. In each of the past five years, the Center has created an
annual report documenting student usage of its services, including the demographics of the students it
serves. These reports have been provided to the Office of Educational Assessment and to the Dean of
Arts and Sciences and have helped inform decisions about the ongoing development and funding of the
Center.
• 2005-2006 Writing Center Assessment Report
• 2009 Writing Center Annual Report
• 2005-2010 Writing Center Top Accomplishments
• 2009-2010 University Writing Program Assessment Report
First Year Experience (FYE)
As part of the strategic initiative related to a more diverse and stimulating undergraduate
experience, the University of Delaware is committed to creating an imaginative and intellectually
stimulating First-Year Experience (FYE) that is fully aligned with the University’s General Education
goals. This new program provides a more supportive entry experience for undergraduates, one that
engages them as a part of the University community from the day they are accepted as students and
helps them prepare for academic success and a fulfilling college experience. In addition, the program
establishes strong academic and social bonds among students, enhances their identity with the
University as a community, and encourages and rewards faculty for active mentoring. The FYE
acquaints students with the broad array of University offerings on campus and across the globe and will
encourage their involvement in programs and organizations that enhance personal development and
promote learning.
All bachelor and associate degree undergraduate students entering the University in September
2005 or later are required to register for and successfully complete at least one FYE. Bachelor degree
students transferring to the University are not required to take a First Year Experience if they have
successfully completed at least one semester at another college or university. However,
certain First Year Seminars are required in the curriculum of particular majors regardless of prior
University transfer credits.
Since its introduction, First Year Seminar (FYS) has been continuously assessed and enhanced
in response to the results of these assessments. The most noteworthy of these changes include:
• Restructuring the staffing of FYS. Initially, each section was led by a student Peer Mentor,
with a faculty member who normally came to class only twice during the semester.
Because students who made significant connections with faculty were achieving the desired
learning outcomes and felt much more positive about their transition into college life, the
University changed the FYS to be faculty-led, with support from the Peer Mentor, who serves as
a teaching assistant.
• Establishing common learning goals. First Year Experience (FYE) is considered a hallmark of
the UD General Education (Gen Ed) program as well as a significant component of the UD
strategic plan. As a result, substantial resources are being invested in continuing to improve
the FYE. The University recently engaged an FYE Steering Committee to develop a more
comprehensive approach for students in their first year at the University of Delaware.
Specifically, the Steering Committee was asked to articulate and define a First Year Experience
that integrates the social and academic dimensions of being a student at UD. The Steering
Committee specified several learning goals (see below) that students should be able to
accomplish as a result of the FYE, and underscored the importance of integrating General
Education skills and knowledge into the FYE. Fall 2010 was the first time this new version of
the FYS learning goals was implemented, and an FYS assessment will be administered at the end
of the course in October 2010 to gauge students’ self-reported gains on these skills as well as
their overall perceptions of the FYS.
FYS Goals
• Identify how your personal decision-making can impact your wellbeing and your ability to
reach your goals.
• Engage in experiences that contribute to your understanding of what it means to be a
respectful and contributing member of a diverse community and global society.
• Develop a plan to ensure your academic success at UD while benefitting from the many
prominent resources available to you.
• Begin to develop your ability to critically analyze and synthesize information.
• Deliver, using appropriate technology, a reasoned, persuasive argument both orally and
in writing.
• Develop mentoring relationships.
The reports of the First Year Experience (aka First Year Seminar) and Learning
Integrated Freshman Experience are found below.
• 2005 FYE Report
• 2006 FYE Learning Integrated Freshman Experience Report
• 2008 FYS E-portfolio Report
• 2008 FYS E-portfolio Survey Report
• 2009 FYS Report
• 2010 FYS Survey
• 2010 FYE Recommendations for Improvement
A common reader is required in all FYS sections to help students connect with UD and with each other.
Based upon feedback from the 2009 entering class that the transition topics and campus
resources information would have been more helpful if introduced earlier in the semester, the entire
schedule was revamped so that all topics are presented within the first eight weeks of class. In addition,
last year students reported that the common reader was not discussed and that there were no follow-up
activities. This year, members of the FYS faculty were instructed to include relevant discussions and
learning opportunities based upon the Tracy Kidder book, Strength in What Remains.
Also, to follow up on the book’s themes of global citizenship and community assistance, students have
the opportunity to attend talks by additional speakers (including the author), as well as presentations
and theater experiences.
Discovery Learning Experience (DLE)
At its May 3, 2004 meeting, the Faculty Senate approved a resolution that “the University
require all undergraduate students entering in September 2005 or later to take at least three credits of
Discovery-Based or Experiential Learning (e.g., an internship, a service learning course, an
independent study, participation in the Undergraduate Research Program or a Study Abroad program)
in fulfillment of their degrees.” Because there is a credit requirement, discovery and/or experiential
learning must carry a course number. The credit requirement may be satisfied in a single
course or in a series of courses, as long as a total of three (3) credits is earned. It is important to note
that the value of the DLE arises from intentional learning, not solely from engaging in the experience.
Because there are myriad ways that students can meet the DLE requirement, there must be a
common expectation of what learning should occur as a result of the experience. Both faculty and
students must be aware of this expectation. To that end, all DLEs must incorporate the following two
student learning goals:
1. Students will apply critical thinking skills and academic knowledge/concepts to develop effective
responses to, and make informed decisions about, problems or situations encountered in the course
of the learning experience.
2. Students will engage in reflection, which incorporates self-assessment and analysis of the learning
that has occurred as a result of their participation in the DLE. At a minimum, students will be
expected to examine and demonstrate what they have learned as a result of the DLE, how they have
learned it, the impact of their learning on their personal and professional growth, and how to apply
that learning in other situations or contexts.
To date, assessments of DLE have occurred within Study Abroad experiences, Service Learning
Experiences (http://www.servicelearning.udel.edu/assessment), and within the courses of individual
programs. For example, Geography faculty members Delphis Levia and April Veness received funding
from the OEA in 2009 to examine DLE courses and to examine student preparedness for the
experience. The results of their study were shared with the Faculty Senate as well as the Gen Ed
committee. The OEA also is in the process of developing a common metric adapted from the AAC&U
VALUE rubrics to be used in assessing student learning in experiences in all courses that constitute a
DLE. DLE guidelines require departments to report student learning assessment results. Beginning in
Winter Session 2010, UD instituted a required reporting form that addresses the DLE student learning
goals and the assessment methods and results for independent study projects. This form allows
data on student learning to be collected and analyzed.
Examples of DLE Assessments: Service Learning
 2007 Service Learning Report
 2008 Service Learning Community Partner Report
 2009 Service Learning Report
 2010 Service Learning Community Partner Feedback Form
Multicultural Course Requirement
The purpose of the multicultural requirement is to provide students with awareness of and
sensitivity to cultural pluralism—an increasing necessity for educated persons in a diverse world. The
University Faculty Senate certifies courses as meeting the multicultural course requirement (3 credits).
This requirement may be fulfilled through a course or courses taken to complete other course
requirements, but it cannot be fulfilled with any course taken on a pass/fail basis. Only course sections
that are designated as multicultural for a specific semester can be used to fulfill this requirement. In
general (though not in all cases), courses that are part of an approved Study Abroad program qualify as
multicultural courses.
Assessment of students’ global citizenship skills has been conducted in many ways. In the
Center for International Studies Study Abroad programs a pre/post measure has been implemented
since 2003. This measure repeatedly has indicated students’ significant gains (p<0.05) in cultural
knowledge and awareness. Additionally, the OEA and Institutional Research Offices have conducted
an assessment of the open-ended questions for the Study Abroad program by applying the AAC&U
rubric for cultural awareness. Because the rubric did not tie in to the learning outcomes as directly as
needed, the results were inconclusive. To remedy this, a new survey has been developed for the Study
Abroad program, and the same measures will also be implemented for comparable courses on campus
during the Winter 2011 session.
For courses on campus, different programs have addressed global citizenship within their
majors. A noteworthy example comes from the Women’s Studies program. After assessment of the
Capstone course indicated that students needed to improve their skills in, and tolerance of, diversity, the
WOMS program purposefully embedded more opportunities to discuss issues of gender through an
international lens. Furthermore, when the program was permitted to hire for a new tenure-track position,
the faculty selected the new faculty member, Pascha Bueno-Hansen, based upon her research into
international women’s issues.
Other on-campus assessments were conducted as part of the UD participation in a Fund for the
Improvement of Postsecondary Education (FIPSE) grant to assess diversity. UD helped to develop and
implement a pre/post questionnaire, the Diversity Perspective Index (DPI). This survey instrument has
been administered on a pre/post basis in UD multicultural courses that satisfy a diversity requirement.
The items on this survey relate to perspectives of diversity in the curriculum and in society. In the
administrations of the survey (Fall 2005 and Fall 2006), student perceptions changed significantly
(p<0.05) on 8 of the 13 items.
Capstone Experience
The Capstone Experience requires students to integrate, synthesize, and reflect upon what has
been learned across an entire course of study. The Capstone Experience may take the form of a
traditional course, such as senior seminar, or it may also include or be entirely constituted by a field
experience, internship, career preparation experience, research, travel, exhibition, or portfolio. The
Capstone Experience may be discipline-centered or interdisciplinary. It may place the undergraduate
experience in a broad context that can be applied to students’ post-college lives.
Capstone experiences are suggested by the Faculty Senate but are not required. Therefore, it is
difficult to ascertain how many capstone courses actually exist on campus. Capstone experiences that
were developed in 2008-2009 include student learning assessment components (see the CTL website for
final reports). The Faculty Senate has a process for certifying courses as a Capstone Experience. As part
of this process, the applicant department must indicate the General Education learning goals met and the
methods of assessment. Although there is no follow-up process to ensure these learning goals are
being addressed and assessed, many programs are using their capstone experiences as an optimal
assessment point to check whether their students are attaining the programmatic outcomes that are
linked to the Gen Ed goals. Therefore, we have some institutional evidence about the capstone
experience’s effectiveness in promoting a student’s ability to synthesize and apply knowledge at the
senior level and their ability to produce an intellectual product and present that product to a variety of
audiences.
Capstone Assessment
Art Capstone Report, 2008
Art Senior Portfolio Review, 2008
Capstone Survey Report, 2003 Spring
Capstone Inventory, 2009
Chemistry Capstone Assessment Report (CBC), 2010 - Goal 1 Goal 10
Entomology Capstone Results (UG Assessment Progress 2009)
Environmental Science 450 Proseminar, Spring 2009
Human Development & Family Studies Capstone Report, 2010
Instructional Grants for Capstones Development
Leadership Capstone with Rubrics
Management of Information Systems Capstone Report, 2009
Mechanical Engineering
Med Tech
Philosophy Capstone Results, 2008
Women’s Studies Capstone Report, 2010
Women’s Studies Capstone Report, 2009
The Office of Institutional Research annually conducts an exit survey of graduating students
and asks undergraduates to report on their attainment of the Gen Ed goals. The findings of this survey
reinforce the findings stated within this document. The survey can be found at this link and the report is
posted on the Institutional Research website
(http://www.udel.edu/IR/reports/exit/ReportonExitWebSurveys07_08_09.pdf).
Recommendations: Future Action Steps and Benchmarks for Establishing a Coherent and
Integrative Program of General Education and University Requirements
This section contains the recommendations of Dr. Karen Stein, Faculty Director of General
Education at the University and Chair of the Faculty Senate General Education Committee.
Recommendation: Faculty Senate provides leadership in examining and promoting a cohesive
program of general education.
2012
o Reexamine current General Education goals for relevancy, clarity, and ability to impact
student learning; develop revised goals.
o Collaborate with the Center for Educational Effectiveness to articulate learning outcomes
for each general education goal.
o Implement a student survey on perceptions of whether and how the three-credit multicultural
requirement gives students a diverse perspective.
2013
o Ensure that all University requirements, including ENGL110, multicultural, FYE, DLE, and
breadth requirements, align with the General Education goals.
o Collaborate with the Center for Educational Effectiveness to develop university-wide
student learning goals for general education programs including the FYE, DLE, and
Capstone Experience. Ensure that these learning goals are progressive and lead to
developmental learning from the first year to the Capstone Experience in the senior year.
o Require e-portfolios in FYSs to engage students in their own learning and as a measure of
gains in general education skills and knowledge.
o Require capstone experiences for all students and develop processes for departments to
apply for capstone designations for new and existing courses.
2014
o Develop processes for faculty to develop curricula and prompts that capture students’
reflection about their learning and connect their programmatic learning outcomes to UD’s
general education goals.
o Obtain results from the Center for Educational Effectiveness to determine how well students
are achieving the desired competencies in general education programs, including FYE, DLE,
Multicultural, and the Capstone Experience.
2015
o Institute a University-wide requirement for e-portfolios as assessment tools for general education
competencies.
o Assess ongoing achievement of students’ competencies and make revisions as necessary.
Ongoing
o Periodically examine general education goals for relevancy and priority.
o Use assessment data to support efforts to improve student learning and program
effectiveness in general education programs.
Recommendation: The faculty assumes leadership in communicating to students the importance and
value of general education so that students understand the reasons for, and the benefits of,
demonstrating their mastery of Gen Ed competencies through participation in activities and
assignments.
2012
o Course syllabi include general education goals and the relevance and connections to the
programmatic learning goals.
2013
o Faculty use University online DLE forms that specify faculty responsibilities for student
learning outcomes assessments and student responsibilities for demonstrating competencies.
2014
o Increased adoption of e-portfolios as an authentic means of documenting students’ Gen Ed
competencies.
Ongoing
o Use assessment data to support efforts to improve student learning and program
effectiveness in general education programs.
Recommendation: Central Administrators assume leadership in promoting General Education as a
priority for the University.
2012
o Students are introduced to student learning outcomes processes beginning with their
acceptance letter that outlines the opportunities they have to learn and grow through
University general education programs beginning with FYE through Capstone Experience.
o General Education Goals are clearly described in the online catalog and in official
University publications that describe academic goals of the undergraduate program.
o Institutional support for development and implementation of e-portfolios as an authentic
means of documenting students’ General Education competencies.
The Office of Educational Assessment has purposefully assessed the General Education goals to
inform the University Faculty about the state of General Education. This review process has helped the
University consider how to systematize General Education to sustain the assessment process and ensure
that all programs have provided their students opportunities to acquire the General Education skills.
The following pages display the Office of Educational Assessment Plan to assess General
Education goals for the 2010- 2017 years.
Gen Ed Assessment Schedule
 Oral Communication: rubric-based assessment of Undergraduate Research Scholars (Fall 2008 – Spring 2009); revised AAC&U rubric used to assess Undergraduate Research Scholars (Fall 2009 – Spring 2010); revised AAC&U rubric used to assess a sample of the UD undergraduate population (subsequent years).
 Written Communication and Information Literacy (source use): UD Writing Center rubric assessment of First Year Writing and Second Writing within English department offerings (Fall 2008 – Spring 2009); of First Year Writing (Fall 2009 – Spring 2010); of Second Writing within English department offerings and departmental sampling (subsequent years).
 Oral, Written, and Quantitative Communication; Critical Thinking: ETS Gen Ed test (EPP) administered to freshmen and seniors (Fall 2010 – Spring 2011), to seniors (Fall 2013 – Spring 2014), and to freshmen (Fall 2014 – Spring 2015).
 All Gen Ed Goals (as articulated within the undergraduate curriculum): nine UD e-portfolio pilots assessed with modified AAC&U rubrics (Fall 2010 – Spring 2011); five pilots assessed with modified AAC&U rubrics in each year thereafter through Spring 2017.*
 Critical Thinking: survey of First Year Seminars and survey of Capstones (administered in multiple years).
 Ethics to Community: survey of First Year Seminars and survey of Capstones (administered in multiple years).
 Global Citizenship/DLE: CFIS pre/post survey (administered on an ongoing basis through Spring 2017).
* CFEE will continue to add e-portfolios as financial support endures from the Provost’s office.
Standard 14: Assessment of Student Learning
The University of Delaware takes very seriously its obligation to assist faculty in drawing
from best practices to describe and measure student learning. Such best practices include
multiple assessment strategies that are discipline-specific and appropriate to the relevant
pedagogies used in that discipline. The Office of Educational Assessment and the Center for
Teaching and Learning work collaboratively with faculty members across the institution to
encourage and support them in their efforts to design appropriate strategies for measuring student
learning outcomes at the course, program, college and institution-wide levels and to
systematically gather and evaluate information from those assessments.
Center for Teaching and Learning
The Center for Teaching and Learning supports student learning assessment by providing
resources and programs that help faculty members engage effectively in assessment
activities. These resources range from helping faculty members design learning-centered syllabi
that include specific learning outcomes to providing instructional grants that require assessing
student learning and using assessment results for curricular improvements.
Center for Teaching and Learning Resources:
Assessing Students
Instructional Grants
Instructional Topics
Teaching Services
Office of Educational Assessment
The Office of Educational Assessment, part of the Center for Educational Effectiveness,
supports academic units assessing undergraduate and graduate student learning. The Office of
Educational Assessment staff provides services that encourage program enhancements through
the implementation of systematic assessment of student learning. An assessment database that
stores all non-externally accredited departments’ plans, curriculum maps, General Education
assessments, data, and reports can be accessed by following the steps below:
1. Click on this link.
2. When prompted (you will be prompted twice) login with the username “viewer” and the
password “@$$E$$” (without the quotes).
3. You will then be presented a list of all of the programs in our database.
4. To filter the list, use the drop-down menus at the top of the page. For example, select
“College of Health Sciences” from the “College” drop-down menu to see only data from that
college. (Note: The “Department” menu is sorted in an unorthodox manner, so typing the
first few letters of the department you are searching for after you click is the easiest way to
filter by department.)
5. To view the data for a particular program, click on the “Details” link on the left-hand side of
the table.
6. To see the full text of any field in the database that is truncated and followed by “…”, place
your mouse over the text for a few seconds and a tooltip will appear with the complete text.
Assessment reporting is continuous; each program is required to provide annual documentation
on at least one of their programmatic outcomes. This is the reporting template. As of Spring
2011, an automated web-form will be available. Chairpersons and directors of academic units
will be required to submit the form annually to their Dean’s office as part of the process of
budgetary meetings and requests for resource allocations. The form will also be routed
automatically to the Office of Educational Assessment for ongoing documentation of student
learning.
E-Portfolio Initiative
The Center for Teaching and Learning, the OEA, and various units comprising Institutional
Technology have embarked on a pilot project to institute an e-portfolio system across the
curriculum. The system is designed to support teaching, learning, and assessment, and is capable
of capturing assessment data both at the programmatic level and for general education. The
prognosis for the e-portfolio initiative is extremely bright given the support from the Provost
and the plans to expand the Writing Across the Curriculum program and to conduct further
assessments. See the Winter Faculty Institute UDaily article. The University of
Delaware is one of 21 campuses nation-wide to be awarded a 3 year “Connect to Learning:
ePortfolio, Engagement, and Student Success” grant to strengthen best practices in ePortfolio
pedagogy. Funded by the US Department of Education’s Fund for the Improvement of Post-Secondary Education (FIPSE), UD will use a structured matrix model to further develop and
implement a reflective teaching, learning and assessment ePortfolio system that supports the
University’s strategic initiative of strengthening student engagement, and provides
documentation of student learning for departments and academic programs.
Grantees were selected through a highly competitive application process. In awarding the grant
to UD, the application review team wrote: “(We) selected your proposal for its vision, thoughtful
planning, and promise of benefit to your students and faculty, as well as your potential
contribution to our collaborative effort to generate evidence-based models for reflective
ePortfolio practice.” Dr. Randy Bass of Georgetown University and Dr. Helen Chen of
Stanford University will serve as the project’s senior research scholars. Dr. Bass will also
serve as a consultant to UD at the January 6, 2011 Winter Faculty Institute.
The “Connect to Learning” project is a collaborative effort by the Center for Teaching and
Learning, Office of Educational Assessment, and the IT-Client Services and Support offices.
For further information about UD’s ePortfolio initiative see: http://www.udel.edu/e-portfolios/
Assessment of Professional Programs
As is the case at many institutions with professional programs, accrediting agencies
such as the Accreditation Board for Engineering and Technology (ABET), the Association to
Advance Collegiate Schools of Business (AACSB), and the National Council for Accreditation of
Teacher Education (NCATE), among others, have an extensive tradition of assessing student
learning outcomes. Representative examples of assessment within professional programs at the
University of Delaware may be found at:
Business (2010 Fall - UDLerner_AACSBMaintAccredReport.pdf)
Environmental Engineering:
Self Study
http://data.assessment.udel.edu/assessment_system/files/College%20of%20Engineering/Center%
20for%20Energy%20and%20Environmental%20Policy/Environmental%20and%20Energy%20P
olic/2005%20Summer%20-%20ABET%20Self-Study.ENEG.Final.pdf
Appendix
http://data.assessment.udel.edu/assessment_system/files/College%20of%20Engineering/Center%
20for%20Energy%20and%20Environmental%20Policy/Environmental%20and%20Energy%20P
olic/2005%20Summer%20-%20Appendix.ENEG.final.pdf
Chemical Engineering:
http://data.assessment.udel.edu/assessment_system/files/College%20of%20Engineering/Chemic
al%20Engineering/Chemical%20Engineering/2005%20Summer%20-%20ChemEngABET05.pdf
Computer Engineering:
http://data.assessment.udel.edu/assessment_system/files/College%20of%20Engineering/Electric
al%20and%20Computer%20Engineering/Electrical%20and%20Computer%20Engine/2005%20
Summer%20-%20cpeg_final.pdf
Mechanical Engineering:
http://data.assessment.udel.edu/assessment_system/files/College%20of%20Engineering/Mechani
cal%20Engineering/None/2005%20Summer%20-%20Self%20Study%20ReportMechanical%20Engineering--62805-all.pdf
Medical Technology
http://data.assessment.udel.edu/assessment_system/Programs/Details.aspx?College=4&Departm
ent=3720&Major=N/A
Nursing
http://data.assessment.udel.edu/assessment_system/files/College%20of%20Health%20Sciences/
School%20of%20Nursing/None/2005%20Fall%20-%20CCNE%20SelfStudy%20Report%20Fal...pdf (See pp. 39-51)
Physical Therapy
http://data.assessment.udel.edu/assessment_system/Programs/Details.aspx?College=4&Departm
ent=2591&Major=N/A
Faculty Assessment Scholars
With the exception of programs such as Chemistry and Biochemistry, which is accredited
by the American Chemical Society, and Foreign Languages and Literature, which is accredited
by the American Council on the Teaching of Foreign Languages, programs and disciplines
within the College of Arts and Sciences have no historical tradition or external mandate other
than that of the Middle States Commission on Higher Education for systematic assessment of
student learning outcomes. Consequently, the University has begun an innovative approach to
building an infrastructure for measuring student learning. The Office of the Provost has
provided funding to create a panel of Faculty Assessment Scholars to work with the Office of
Educational Assessment and the faculty within the College of Arts and Sciences to establish best
practices for assessment of student learning outcomes within the College’s programs and
disciplines. Three Faculty Assessment Scholars—Iain Crawford, English, Delphis Levia,
Geography, and Don Lehman, Medical Technology—currently advise the Office of Educational
Assessment about assessment practices, as well as perform audits of the database and make
recommendations for programmatic improvements. The following table summarizes their
evaluation of assessment activity within six disciplines in the College of Arts and Sciences, and
is followed by a more detailed discussion of that assessment activity to provide the readers of
this document review with a more comprehensive sense of where the College is headed.
Sample Table of Assessments

Program                        | Learning Goals | Curriculum Mapping | Rubric(s)/Measurement | Closed Loop | Continual Assessment | Overall Rating
Chemistry & Biochemistry       | S              | S                  | P                     | S           | S                    | S
English                        | P              | W                  | S                     | W           | S                    | –
Foreign Languages & Literature | S              | S                  | W                     | S           | W                    | S
Physics & Astronomy            | S              | P                  | W                     | –           | P                    | P
Psychology                     | S              | S                  | P                     | W           | P                    | W
Women’s Studies                | P              | W                  | S                     | S           | S                    | S

Assessment classification key: S = strong; P = promising; W = weak; – = not applicable
Representative Analysis of Assessment Activity Within the College of Arts and Sciences
By Assessment Scholars: Iain Crawford, Associate Professor, English, Don Lehman, Associate
Professor, Medical Technology, Delphis Levia, Associate Professor, Geography, and Kathleen
Langan Pusecker, Director of Educational Assessment
Many of the University of Delaware’s Colleges are professionally accredited (Business/AACSB;
Engineering/ABET; Education/NCATE; Health Sciences/NLN, etc.) and have a long history of
systematic outcomes assessment. As is the case with many universities, assessment of student
learning within the College of Arts and Sciences has been somewhat uneven. The following
analysis is representative of the University’s approach to bringing Arts and Sciences into the
assessment movement.
Narrative of Strong Assessment Activity: Chemistry and Biochemistry
Strengths
The Department of Chemistry and Biochemistry in the College of Arts and Sciences has pursued
an active assessment program. It has established 10 high-level Department learning goals, many
of which are matched to the University’s General Education Goals. Their curriculum map
clearly identifies in which courses the Department Goals are covered. The Department has
documented direct and indirect measurements of assessing their learning goals. Students in the
program are administered a senior seminar survey, and several of the items in the survey are
linked to the Department learning goals, providing indirect evidence of meeting the goals. The
program also has direct evidence from student scores on the American Chemical Society (ACS)
standard General Chemistry examination.
The Department has used data to make changes to the curriculum and assessment methods. A
rubric and a student self-evaluation form are used to evaluate student oral communication skills.
Based upon an analysis of the student survey, it was found that students lacked self-confidence in
oral presentations. As a result, the Department changed the oral presentation rubric and decided
to provide the students with better feedback. The documented scores on the ACS examination
indicate students are performing overall at a proficient level. However, the ACS exam identified
one area of weakness: aqueous equilibria, particularly acid-base and sparingly soluble systems.
Students appear not to be learning this material well and/or not integrating that knowledge well
between courses. As a result, the undergraduate curriculum committee will be asked to discuss
this challenge and develop proposals for the better integration of this important topic among the
courses involved. The Department is using data from assessment to influence the curriculum.
Weaknesses and Plans to Rectify Weakness
Additional assessment tools should be developed. The ACS exam is administered early in the
students’ course of study as the final exam for CHEM 112; perhaps a similar direct measure
could be used in senior courses. Suggestions for additional assessment methods include
interviews of graduating students, monitoring awards, admissions to graduate and professional
programs, and scholarships received.
Narrative of Strong Assessment Activity: Foreign Languages and Literature
Strengths
The Department of Foreign Languages and Literature has conducted a world-class assessment
program over the past several years. It is clear that an active assessment culture is in operation
and the department uses assessment to effectively improve student learning. Learning goals are
well defined and linked with General Education goals. The curriculum mapping is excellent and
there are multiple rubrics to effectively meet the diversity of programs within the Department.
The assessment reports are meticulously done with an eye to detail. Both direct and indirect
methods of assessment are effectively employed to gauge student learning. The assessment
program also is aligned with external accrediting bodies in some cases, such as the American
Council on the Teaching of Foreign Languages, for the language education program.
Weaknesses
More diversity with respect to indirect assessment methods may amplify the scope of the student
learning process. It is recommended that the Department examine other options for indirect
measures of student learning.
Plan to Rectify Weaknesses
In addition to student surveys, other methods of indirect assessment may be utilized. Perhaps,
entry and exit surveys, alumni questionnaires, or a tabulation of student honors, awards, or
scholarships could be used to broaden the suite of assessment tools.
Narrative of Strong Assessment Activity: Women’s Studies
Strengths
The interdisciplinary program in Women’s Studies has made substantial progress in assessment
and is evidently committed both to collecting and reflecting upon data about student learning
experiences and outcomes. In 2006, the program developed eight learning goals. These goals are
specific and concrete and, in a number of cases, illustrated by detailed examples. The program
has also made students aware of the goals through publishing them in its annual newsletter.
Beginning in 2008, the program has developed survey instruments for its majors and collected a
significant amount of data. These instruments include a survey of where the various learning
goals are addressed within the curriculum, a survey of a study abroad program in India, and a pre-
and post-capstone senior survey. It is noteworthy that each of these instruments gathers data on
specific issues and also provides respondents with the opportunity to offer qualitative, open-ended feedback about particular topics and the program as a whole. Although the number of
respondents was small, the survey of the study abroad program was exemplary in that it:
collected a wide range of data on 23 separate points; gathered extensive qualitative feedback; and
prompted considerable reflection by the faculty. The most productive of the instruments appears
to have been the senior survey. This has been administered three times (2008-09) and been
modified each year in the light of previous experience. As in the case of the study abroad survey,
the senior survey collects a wide range of information and has evidently prompted considerable
faculty reflection and discussion of the program as a whole. It appears that this discussion has in
turn led to some curricular revision, but the document listed in the Assessment Database was not
available for review.
Opportunities for Continuing Development
Of the eight goals the program has defined for itself, only one refers specifically to a concrete
student outcome and to how majors will “demonstrate” particular skills. Two of the goals refer to
what students will “understand,” and four refer to what students will “examine.” Understanding
is a broad term that is not easily assessed, while examining refers to student activity rather than
the actual outcomes of learning. The program might find that re-examining the formulation of
these goals would make the work of assessing them more practicable. There is no curriculum
map in the Assessment Database. Constructing such a map can be particularly challenging for an
interdisciplinary program that, essentially, relies upon course offerings from a number of
departments. However, that reliance can also be seen as making such mapping even more
important than is the case for disciplinary programs; given that the program draws upon
faculty from multiple departments, it should at the least map outcomes for the four courses that
are required for the major. The program’s assessment work to date has relied upon student
surveys and thus upon essentially indirect measures of learning. The next stage for the program
should be to develop its first direct measures, gathering data that is not dependent upon student
self-reporting.
Summary
Overall, Women’s Studies has developed a strong culture of assessment and made substantial
progress in gathering and reflecting upon data about student learning. These accomplishments
are all the more commendable, given the interdisciplinary nature of the program and the fact that
it does not enjoy the disciplinary unity and administrative infrastructure of the academic
departments. The program has clearly established a strong foundation for continuing success in
assessing its effectiveness. A natural next step would be to move from indirect to direct measures
of assessment and, to this end, some re-examination of the learning goals formulated in 2006
might be helpful, since their restructuring may well point to appropriate means of assessing
them.
Narrative of a Promising Assessment Activity: Physics and Astronomy
Strengths
The Department of Physics and Astronomy in the College of Arts and Sciences has recently
initiated an assessment program; the first data submitted are from a 2008 student survey. The
Department initially established 10 high-level learning goals for undergraduate students. Many
of the goals were linked to the University’s General Education Goals. The learning goals are
assigned to one of three levels: L0, L1, and L2. Each level has been mapped in the curriculum.
To determine to what extent the program has met the goals, the Department plans to collect exam
score data, to administer a survey to graduating seniors, and to conduct exit interviews.
Subsequently, the Department established 46 learning goals. To assess these learning goals,
course instructors were asked to include in their exams common problems that addressed specific
learning goals and to report the proficiency of the students. Proficiency ranged from 40% to
90%. Unfortunately, the problems were not linked to individual learning goals. The Department
noted that approximately a quarter of physics majors are struggling in their sophomore
(300-level) classes. The Department plans to monitor the students to see if proficiency improves in
higher-level courses. The Department has also written six learning goals for students in graduate
programs: three for the Master of Science and three for the Ph.D. Each of the six learning goals has
been mapped to courses in the curricula. The Department has listed several direct methods of
assessing learning goals including candidacy exam scores, public presentation and written
reports, and thesis defense.
Weaknesses and Plans to Rectify Weaknesses
The Department currently has 46 learning goals, which might be excessive and will be difficult
to assess. It would be beneficial for each undergraduate learning goal to be mapped to a course
instead of mapping the three categories: L0, L1, and L2. This would provide more detailed
information to guide curricular changes. Initial interviews of seniors (2008) have identified
some areas of weakness. However, the Department has not yet submitted a plan that rectifies the
problems. A Department committee should review the areas of weakness and propose curricular
changes to address them. The entire faculty could then meet to discuss the proposed changes.
The survey the Department is using is short and does not address many of the Department’s
learning goals; additional questions relating to those goals should be added. Data from the
survey would then be a useful indirect measure of assessment. Survey results
could be monitored over time to detect trends. The Department will be monitoring student
proficiency on a set of common problems. The Department should also investigate changes to
the 300-level courses to improve proficiency on those problems. The Department has identified
only two undergraduate assessment methods (a survey and exams); additional assessment
methods are needed. It was not specified how exams would be used to determine whether the
Department was meeting the learning goals. The Department could develop a standardized
examination for one of their senior level courses. Some of the questions could be linked to
Department learning goals. If the same questions are used in multiple years, this could help the
Department determine if curricular changes are having the desired effects. Additionally, exam
questions in other courses could be used to monitor if the Department is meeting its goals. Such
direct measures allow the Department to monitor student progression through the curriculum.
While the Department has established learning goals and identified two assessment methods for
undergraduates, it has not documented clearly how the data will be used. For the graduate
programs, only direct measures are planned; the Department needs to incorporate some indirect
measures as well. It is suggested that the Department form a committee to
develop additional assessment methods and interpret the data collected.
Narrative of a Strong Assessment Activity: Biological Sciences
Strengths
The Department of Biological Sciences in the College of Arts and Sciences was one of the early
adopters of assessment of student learning and has tremendous leadership from the assessment
chair as well as critical faculty participation. The Department has a curriculum map and during
summer 2010, asked faculty to review the map to ensure adequate coverage and assessments of
student learning outcomes. Students in the program annually participate in a graduation survey
that provides faculty with feedback about the students’ self-reported confidence in attaining the
programmatic outcomes. This indirect evidence is used to inform the faculty about students’
perceptions about the program and their own learning. Direct measures of student learning have
included examination of scores on quantitative biology problems to assess students’
competency with the Department’s goal #3: Understand and apply mathematical approaches
to analyze, interpret and model biological processes (GenEd: Quantitative Reasoning).
Students were rated by faculty as needing improvement in the quantitative biology skills required
of the major. For goal #6: Demonstrate writing and oral communication skills
important for communicating scientific ideas (GenEd: Communications Goal), faculty
teaching Bio 207 developed a lab report rubric to assess the quality of the scientific writing.
Students’ writing performance was also rated by faculty as needing improvement.
Results
For Goal #3, the Department used data to make changes to the curriculum in two ways. First,
a direct assessment of students’ math skills indicated that Biological Sciences students needed
stronger quantitative reasoning skills than the normal Mathematics sequence was developing.
The Department then met with the Mathematics faculty to discuss the development of a Math
course that would incorporate biological data so that students would better connect Math and
Science. The successful results of this study were published: see “A Transformative Model for
Undergraduate Quantitative Biology Education,” CBE Life Sci Educ 2010, 9: 181-188. The
second way that the curriculum changed was to provide additional training for the Biology
faculty members about how to enrich their courses with more quantitatively demanding
laboratory and classroom problem sets. For Goal #6, faculty provided additional training to TAs
about scoring student work and providing feedback. The Writing Center also was enlisted to
provide support for students’ writing in the curriculum. The Writing Center now also possesses
the lab report rubrics so that their tutors can better advise biological science students.
Implementing rubrics has provided clearer expectations for student learning and not surprisingly,
has also improved student performance.
Weaknesses and Plans to Rectify Weaknesses
A large core faculty group is participating in the Howard Hughes Medical Institute grant and is
reconfiguring the introductory curriculum and measuring the effects of the changes on the
course’s learning outcomes, which connect directly to the Biological Sciences and Chemistry
programmatic outcomes. That being stated, the Department should also survey alumni and
employers of its students to ensure that curricular changes align with current demands in the field.
Narrative of a Promising Assessment Activity: Psychology
Strengths
The Department of Psychology made a strong beginning to its assessment work in 2008 and,
more recently, has built upon this first initiative with a series of important further steps.
In 2008, the department reviewed the 40 learning goals defined by the APA and identified a
subset of 14 as those most relevant to its own goals. It then developed a faculty survey in order to
narrow this subset down to the 4-5 goals that would be adopted as the primary goals of the program.
Conceptually, this was a strong approach: by drawing upon the APA literature, the department
was aligning itself with best national practice in the discipline, and the subset of APA goals it
selected includes learning outcomes that are exceptionally sophisticated and well-defined.
Indeed, these are outcomes that could serve as models for other programs. The survey was later
conducted, and it led to the creation of a document outlining the specific learning goals upon
which the department decided. Complementing the definition of these goals, the department has
also created a comprehensive map of its curriculum. As a first actual assessment exercise, the
department administered an instrument to a large group (391 students) of its majors. Faculty
participation in this exercise was excellent, with all but one member of the department involved.
Using a survey and a series of test questions, the exercise explored the outcomes of the program’s
Research Design and Stat courses, generating significant data that should be invaluable to its
discussions and future planning of the curriculum.
Weaknesses and Plans to Rectify Weaknesses
Given the high quality of this recent work, it is unfortunate that the department’s APR in 2009
includes almost no discussion of assessment: other than a brief mention of a 2006 survey on
advising, the self-study report makes no reference to learning outcomes or assessment. Rather, it
focuses upon traditional input measures of undergraduate student quality, such as the SAT scores
of incoming students or the GPAs of Psychology majors. Overall, then, the report centers on
static input metrics: indicators of student quality, the number of faculty lines, and so on. As a
result, it has no opportunity to examine the quality of the learning outcomes the program
produces for its students. For example, while the report does note that growth in the number of students served
has prevented the department from offering many writing-intensive courses, it does not include
any data about the writing abilities of Psychology majors and how these might have been
affected by the dearth of writing instruction throughout the curriculum. One opportunity for
bringing together the APR process and the more recent strong work in assessment might be to
build on the department’s observation in its APR report that it has no information on the career
choices of its majors. A useful step, then, could be to conduct a senior survey that would
examine student perceptions of learning in these areas, could provide qualitative feedback on the
program as a whole, and could offer preliminary information about career choices. The latter,
together with information from Career Services and/or alumni surveys, would generate extensive
material to inform the department about the career pathways its students follow, the kinds of
skills those careers require, and the extent to which the curriculum successfully prepares students
for those careers. Overall, Psychology is at a promising point in its work on assessment. After
some delay, and despite not having addressed assessment in the APR, the department has put
together a strong set of learning goals and a comprehensive curriculum map. It has also carried
out a large assessment exercise and generated revealing and useful data about its students’
progress in the major. There is thus now a strong foundation in place for effective future work.
Narrative of a Weak Assessment Program: English (B.A. Program)
Strengths
The Department of English has begun the assessment process. A number of learning goals are
well defined. A partial curriculum map has been formulated. In addition, a writing rubric has
been partially developed and initial assessment activity has taken place in two courses. Reading
through the Department’s assessment materials in the on-line database, one gets an inkling that
a culture of assessment may be beginning to emerge. The Department is at a critical
crossroads in terms of program development and assessment. It has a golden opportunity to
use assessment to build its program in a meaningful way that supports its vision of the
future.
Weaknesses
At present, the curriculum map and writing rubric are incomplete. There appears to be no
synergy between programmatic development and assessment. Despite these shortcomings, the
Department is well poised to use assessment to develop its undergraduate program for the
future. A detailed plan of action is set forth in the following section.
Plan to Rectify Weaknesses
The Department of English is large and complex with multiple concentrations within the major.
These concentrations include: literature, creative writing, drama, film, and ethnic and cultural
studies. The diversity and intricacies of each of these concentrations necessitate assessment and
programmatic development at the concentration level. Because the Department is grappling with
curricular change and corresponding self-evaluation and reflection, it is uniquely positioned
to dovetail an incipient assessment program with its curricular changes. Five major steps are
recommended to couple assessment and curricular change to build an excellent program with
congruence between student learning goals and outcomes: (1) redefine learning goals for each of
the five concentrations within the English major; (2) develop curriculum maps for each
concentration; (3) formulate rubrics with clear criteria to measure the level of student learning
directly within each concentration; (4) develop a suite of innovative indirect assessment tools;
and (5) utilize the assessment to further refine the curriculum and improve student learning.
The English faculty is encouraged to develop a set of learning goals particular to each
concentration. This will invariably involve protracted discussions by faculty (within given
concentrations) concerning the end goals of their curriculum. Such discussions will be fruitful,
as they can lead to a concrete direction for the concentrations in the form of meaningful learning
goals. In this way, programmatic assessment and curricular change occur in tandem and
reinforce each other in a synergistic way. Armed with clear learning goals, the faculty should
map the curriculum within each concentration to ensure that students are exposed to the learning
goals. Curriculum mapping may very well foster discussions on curriculum reform as students
need to be exposed to the learning goals throughout their undergraduate experience.
Development of concentration-specific rubrics will then compel faculty to set clear criteria as to
how students will be assessed vis-à-vis the learning goals. This may prompt discussion of
various pedagogical techniques to ensure students are meeting specific learning goals. Such
direct measures of assessment should be scaffolded to ensure that incremental advances are
being made at various stages within the curriculum. Assignment- and concentration-based indirect
assessment methods may include student questionnaires on the congruence between rubrics and
student learning goals, or entrance and exit surveys. The results of the assessment efforts, perhaps
in the form of e-portfolios, should then be utilized to close any gaps in the curriculum where
student learning is impaired. Although the assessment culture is not yet deeply rooted in the
Department, it is clear that the roots already in place could be used in an efficient and
effective manner to guide programmatic development. The Office of Educational Assessment
looks forward to working with the Department to build a strong future by coupling programmatic
assessment with curricular change.
In addition to the foregoing analysis of assessment activity in Arts and Sciences, the Faculty
Assessment Scholars also provided the following recommendations to the Faculty Senate and the
Provost to ensure the quality of programs.
Recommendations
On the basis of funded assessment initiatives from the Office of Educational Assessment for
quantitative reasoning, discovery learning experience (DLE), and the writing program, the
following recommendations are made:
● The University of Delaware should develop an effective QL/QR (quantitative
literacy/quantitative reasoning) instrument for assessing changes in students’ QL/QR
proficiency during their four years at the University. An effective plan would include
not only the instrument itself but also an effective data collection scheme.
● The University of Delaware should initiate a series of conversations on campus about the
most effective means of achieving university-level QL/QR skills. This discussion should
include consideration of a Mathematics Fellows program based on the successful Writing
Fellows program. One possible approach would be to integrate QL into courses across all
disciplines and programs. This conversation should also include a review of successful
QL programs at other institutions.
● Social science DLE courses that rely on a repertoire of knowledge, skills, and experiences
acquired largely during that course are appropriate for sophomores, juniors, and seniors,
because students at each of these levels are starting from the same point in their learning.
For this reason, we recommend that social science DLE courses be available only to
sophomores, juniors, and seniors.
● Natural science DLE courses should be reserved for junior- and senior-level students who
have acquired the background required to reflect meaningfully on the subject matter and
its relationship to the larger discipline.
● The University should periodically review courses listed as DLE to be sure course
content actually reflects a DLE.
● A University-wide study of second-writing courses should be commissioned, using
insights and proposed methods from the 2009 Writing Program Assessment, to gain a
better understanding of the impact of second-writing courses on students’ writing skills.
Such studies may need to collect writing samples over multiple semesters in order to
have an adequately varied database, and the creators of such assessments might want to
mix essays from first-year writing courses with those from second-writing courses so that
raters cannot identify the course number.
● University administration should develop incentives for faculty to incorporate “writing
across the curriculum,” as well as offer continuing education on teaching writing in
non-English courses.
Summary
The University of Delaware takes seriously its commitment to systematic assessment of student
learning outcomes. Assessment activity within the professional disciplines at the University is
both comprehensive and long standing in terms of faculty engagement. Assessment within
disciplines in the College of Arts and Sciences has a spottier history with respect to
systematizing the process, although most disciplines have assessed learning in one form or
another. The work of the Faculty Assessment Scholars reflects an intentional and disciplined
approach to organizing measurement of student learning within the College of Arts and Sciences.
The following table, and its associated links, summarize where the various programs and
disciplines across the University stand in developing a mature outcomes assessment process.
The table, created by the Assessment Scholars, is divided into three columns: column one
contains the assessment categories that the Scholars focused upon; column two provides
exemplary examples for each category; and column three, Future Progress, presents their
recommendations for ongoing improvement.
Standard 14 Table
To view many of the links below, you need to be signed into the Office of Educational
Assessment’s database:
Enter Username: viewer
Enter Password: @$$E$$
Assessment Category: Outcomes

Exemplary Examples:
External Accreditation
Business-AACSB
Environmental Engineering-ABET (Self-Study | Appendix)
Chemical Engineering-ABET
Civil Engineering-ABET (Self-Study | Appendix)
Computer Engineering-ABET
Ed Tech-NCATE
Electrical Engineering-ABET
Mechanical Engineering-ABET
Med Tech-NAACLS
Music-NASM
Nursing-CCNE (See pp. 39-51)
Physical Therapy-CAPTE
Professional Guidelines
Chemistry-ACS
FLL-ACTFL
Disciplinary Norms
Geography
Philosophy
HRIM
Women’s Studies Study Abroad Report
E-portfolio Projects

Future Progress:
● Maintain alignment with evolving standards of external accreditors
● Address any inconsistent practices with assessment and alignment to guidelines
● Increase implementation of scaffolding in assessment progress
● Place further emphasis on e-portfolios to chart student progress

Assessment Category: Measures

Exemplary Examples:
Animal Science Rubric Presentations
Art
Black American Studies Capstone Rubric
Bio-resources Engineering
Entomology Graduation
Linguistics Grad Program
Entomology (Grad Survey | Rubric)
Environmental Science
Engineering Technology
Fashion Apparel Design
Food Science Grad Survey
Health Exercise Science
Human Development and Family Studies Rubric
Medical Technology
Nutrition
Sport Management
Visual Communication Senior Rubric

Future Progress:
● Target departments and programs with less-well-developed assessment programs and conduct workshops on rubric development
● Provide additional training for departments to conduct alumni/employer surveys
● Continue to evaluate programmatic learning goals and conduct ongoing curricular maps
● Implement a more diverse array of indirect measures of student learning to enrich and broaden assessment evidence
● Require direct measures for each assessment
● Institute workshops for training faculty in diverse measurements

Assessment Category: Evidence

Exemplary Examples:
Asian Studies (Report)
Animal Science
Associate in Arts Written Communication
Biological Sciences
Business
Chemistry
Communication
Electrical Engineering PhD
Entomology & Wildlife Ecology
Hotel, Restaurant, & Institutional Management (Survey)
Mechanical Engineering ABET
Music
Nutrition
Women’s Studies
Writing

Future Progress:
● Provide more incentives to expand the culture of campus assessment
● Explicitly link assessment practices with the Path to Prominence

Assessment Category: Reflection (Closing the Loop)

Exemplary Examples:
Animal Science
Environmental Science
FLL
Communication (Report about COMM 212)
Math
Medical Technology
Music
Physics
Plant and Soil Science-Graduate Program
Project Management
Quantitative Biology

Future Progress:
● Formulate an action plan to involve the Assessment Scholars in academic program reviews

Assessment Category: Support

Exemplary Examples:
Institutional Assessment Fellows Program
Webform Annual Assessment Report
Center for Teaching and Learning Instructional Grants
Office of Educational Assessment Grants
Levia and Vaness-DLE
Ianetta Writing Center Report
Quantitative Reasoning-Rossi and Daley
Anderson-Face to Face versus Hybrid course delivery in large survey courses
Request for Proposals 2010-ePortfolios

Future Progress:
● Continue funding assessment projects that shed light onto student learning and that can lead to recommendations to the Faculty Senate
● Connect assessment to budgetary decisions
● Strengthen institutional research capability to support assessments within departments (Data Warehouse)