2007-08 Assessment Report
St. Cloud State University
Submitted to Provost Michael Spitzer on February 17, 2009
Prepared by
James Sherohman, University Assessment Director
Elaine Ackerman, College of Education
Wendy Bjorklund, College of Fine Arts and Humanities
Holly Evers, University Assessment Office
Carol Gaumnitz, Herberger College of Business
Christine Inkster, Learning Resources and Technology Services
James Knutson-Kolodzne, Student Life and Development
Joseph Melcher, College of Social Sciences
Amos Olagunju, Undergraduate Studies
Maria Womack, College of Science and Engineering
Table of Contents

Sections
Assessment of Student Learning in the Five Academic Colleges
Assessment of Student Learning Centered Outside of the Five Academic Colleges
Recent Accomplishments and Next Steps
Academy for the Assessment of Student Learning

Supporting Documents
Appendix A. Reports of the College Assessment Directors
College of Education by Elaine Ackerman
College of Fine Arts and Humanities by Wendy Bjorklund
College of Science and Engineering by Maria Womack
College of Social Sciences by Joseph Melcher
Herberger College of Business by Carol Gaumnitz
Appendix B. Summary Information on Academic Programs to Accompany Appendix A
Appendix C. Summary of Activities and Likely Impacts of 2007-08 Assessment Grants
Appendix D. Progress Toward Goals of Student Learning Projects for HLC Assessment Academy
Appendix E. Report on Undergraduate Studies by Amos Olagunju
Appendix F. Report on Learning Resources and Technology Services by Christine Inkster
Appendix G. Report on Student Life and Development by James Knutson-Kolodzne
Assessment of Student Learning in the Five Academic Colleges
2008 Program Assessment Reports
The academic colleges use a standard procedure for gathering assessment information from
programs and compiling this information into college reports. Each major, free-standing minor,
graduate, and free-standing certificate program is expected to submit an assessment report
annually. Programs are encouraged to use the recommended template for these reports, in
order to facilitate the aggregation of information. The reports are submitted to the college’s
assessment director, who compiles a summary report for the college. Appendices A and B of this
report include information from the college reports. Appendix A consists of narrative
descriptions written by the college assessment directors. Appendix B consists of a chart
that summarizes information the college assessment directors extracted from the annual
assessment reports submitted by academic programs. The program-level reports themselves
remain in the colleges.
Based upon information provided by programs in their 2008 assessment reports, college
assessment directors recorded whether each program accomplished the following tasks:
1) Did the program assess student learning outcomes that are included in its assessment
plan?
2) Were findings reported on any of these outcomes?
3) Did the program use direct measures of student learning?
4) Did program faculty discuss assessment processes or findings?
5) Were any changes proposed based upon data collected this year?
6) Were any changes implemented based upon data collected this year or in previous
years?
7) Were data collected in an effort to see if changes that have been implemented had the
desired effect?
Appendix B provides a program-by-program description for these tasks. The table below
summarizes this information by college:
| | COE | CoFAH | COSE | COSS | HCOB | Total |
| --- | --- | --- | --- | --- | --- | --- |
| 1) SLOs assessed | 72% | 66% | 42% | 34% | 92% | 53% |
| 2) Describe findings | 77% | 68% | 29% | 21% | 50% | 44% |
| 3) Direct measures | 87% | 45% | 40% | 31% | 58% | 48% |
| 4) Discussions | 82% | 66% | 32% | 31% | 58% | 49% |
| 5) Changes proposed | 49% | 52% | 34% | 17% | 42% | 36% |
| 6) Changes implemented | 23% | 16% | 23% | 9% | 0% | 17% |
| 7) Data collected on changes | 15% | 5% | 19% | 0% | 0% | 10% |
| Total number of programs | 39 | 44 | 68 | 58 | 12 | 221 |
| Programs reporting | 36 | 23 | 39 | 24 | 11 | 133 |
| % of all programs reporting | 92% | 52% | 57% | 41% | 92% | 60% |
The bottom row of the table shows that 60% of all programs submitted reports. The
percentages in the top seven rows of the table are based upon the total number of programs,
not just those that submitted reports.
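To illustrate how these figures are derived (this restates the table above rather than adding new data): the 92% reporting rate for the College of Education reflects 36 reporting programs out of 39 total (36/39 ≈ 92%), and the 53% college-wide figure for "SLOs assessed" likewise uses all 221 programs as its denominator, representing roughly 117 programs rather than a share of the 133 that reported.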
A major goal for 2009 is to elevate these percentages by increasing the number of programs
that submit reports. A second goal is to increase the percentage of programs that implement
changes based upon assessment findings.
Several patterns pertaining to the colleges are worth noting:
1) Programs in the Colleges of Education and Business were most likely to submit an annual
assessment report, while those in the College of Social Sciences were least likely. Although
the overall reporting rate clearly was disappointing, several factors help to explain why the
rate was so low in some colleges. First, although historically the focus of assessment has
been on major and graduate programs, this year an effort was made to include free-standing minor and certificate programs. (Free-standing programs have student learning
outcomes that are not subsumed by another program, such as a major program or a
graduate program.) Second, BES majors tend to be difficult to assess, and many of them did
not submit reports. Third, some departments submitted reports for some but not all of their
programs, and some departments that submitted no program assessment reports did
submit an upper-division writing assessment report or a general education assessment
report. In other words, the percentage of departments that submitted at least one report of
any kind is substantially higher than the percentage of programs that submitted reports.
2) Programs in the Colleges of Science and Engineering and Education were most likely to
“close the loop” by implementing changes based upon assessment findings and by collecting
data on the impact of these changes.
3) Programs in the College of Education are especially likely to use direct measures.
4) Some of the differences among colleges may stem from coding differences among the
college assessment directors. They worked from common guidelines, and they discussed
these prior to processing the program reports, but this may not have been enough to ensure
that they interpreted the criteria uniformly. Additional attention will be devoted to this next
year.
Assessment Plans
The Assessment Steering Committee has established a goal that the assessment plan for every
academic program be linked to or posted on the Assessment website. Assessment plans help
ensure that key learning outcomes for the program are identified and given sufficient attention
in the assessment process. Having this information available on the Internet benefits program
faculty, students, and members of assessment and curriculum committees. The Assessment
Steering Committee has identified four essential elements of an assessment plan. The table
below shows progress over the past year in the percent of major programs for which each part
of the assessment plan has been posted.
| Assessment Plan Component | 2007 | 2008 |
| --- | --- | --- |
| Mission Statement | 85% | 96% |
| Student Learning Outcomes | 50% | 74% |
| Program Matrix | 12% | 46% |
| Timeline | 7% | 24% |
These figures provide conservative estimates of how many major programs have completed
their assessment plans; some programs have completed components but not posted them.
However, there are some noteworthy patterns in this table:
1) Substantial progress has been made over the past year in completion of assessment
plan components.
2) Many programs still do not have complete assessment plans.
3) The program matrix and timeline are the components of the assessment plan that are
least likely to have been completed.
In 2009 the Assessment Steering Committee will make a concerted effort to substantially
increase the percentage of programs that have complete assessment plans.
Upper-Division Writing
Twenty programs submitted assessment reports on upper-division writing. This is a substantial
increase over last year. Clarification of oversight responsibilities for the assessment of upper-division writing probably would result in a further increase in reporting. The General Education
Committee has recommended that college assessment committees receive the reports and
provide feedback to departments, much like what is being done now with the program reports.
General Education
Nine programs submitted assessment reports on general education courses. Because SCSU is in
transition to a new general education program, many departments are reluctant to assess the
old one. Some parts of the old program are, in fact, nearly impossible to assess. The new
program will be much more assessable, and as soon as the structure of the new program is in
place, general education assessment activity should expand greatly. In the meantime, the
limited assessment that is taking place in the old program will be useful to the GETGOs (General
Education Teams for Goal Oversight) that will be coordinating assessment of the new program.
Much of the current assessment activity is in core areas of general education, is fairly well
established, and is being used for improvement.
Assessment of Student Learning Centered Outside of the Five Academic Colleges
There is no standard format for annual reports from units outside the five academic colleges.
Submission of these reports is optional. The reports that are received are included in the
appendix of the institutional report. Below is a summary of assessment activities in units outside
of the academic colleges.
The Center for Continuing Studies now provides training in use of the Quality Matters rubric,
which is used to evaluate and improve online courses. In August 2008 an initial group of faculty
members were trained to use the rubric.
Undergraduate Studies is becoming more systematic and thorough in its approach to
assessment of student learning. It has developed an assessment plan for this year, and
collection of data on student learning has been ongoing in some units. For additional
information, see Appendix E.
Learning Resources and Technology Services has conducted assessment studies for a number of
years. The focus of these studies has moved increasingly toward student learning, although
student satisfaction still is the primary focus. The findings from these surveys have led to a
number of improvements in library operations. For additional information, see Appendix F.
Student Life and Development has developed division-wide learning outcomes. Departments
and programs within the division are developing learning outcomes and measures. The division
has undertaken efforts to communicate learning outcomes to students and to train staff
members in assessment basics. For additional information, see Appendix G.
Recent Accomplishments and Next Steps
Completion of assessment plans and reports is important, but it does not necessarily result in
improved student learning. Successful program-level assessment requires a support structure to
help faculty and staff members overcome the obstacles described below.
Effective assessment requires time for discussion, which may not be easily accommodated by
departmental routines. Even if time for such discussions is available, faculty and staff members
may have fears and misconceptions about assessment that make them reluctant to participate.
The Assessment Steering Committee has undertaken numerous initiatives to support the
program assessment efforts of departments and units at SCSU. Of course, the programs that
might benefit most from the available resources often don’t use them. The combined impact of
the initiatives has been substantial, but much work remains to be done. Below, recent
accomplishments of the Assessment Steering Committee are listed, followed by next steps. The
recent accomplishments were undertaken between fall 2007 and fall 2008. The list of “next
steps” includes initiatives approved but not yet completed by the Assessment Steering
Committee, as well as some areas that are in need of attention but for which there is not yet a
plan of action.
Recent Accomplishments
1) HLC Academy for Assessment of Student Learning – As noted throughout this report,
progress continues to be made towards the completion of our Academy goals. Additional
details follow.
2) Improved Reporting - Based upon input from faculty, college assessment directors, and
table discussions at the 2007 Assessment Luncheon, several changes were made in the
reporting template. For example, an open-ended question on assessment discussions was
replaced with a checklist, and the grid on the template was edited so that the information
requested focuses more clearly on specific student learning outcomes. These changes may
have contributed to the greater use of the template by reporting programs in 2008. Use of
the template facilitates aggregation of information from the program reports, which
contributes to greater accuracy of the information in this institutional report.
3) More Extensive Reporting - More accredited programs used the recommended template in
2008 than in 2007. This was especially true in the College of Education and in the Herberger
College of Business, both of which have college-wide accredited programs. Much of the
assessment work in these colleges is done at the college level. However, some of the
information for the annual program-level reports can only come from program faculty. The
assessment directors in these colleges must work with individual departments, while also
attending to college-wide priorities mandated by their accrediting bodies. This creates a
coordination and reporting challenge, and that challenge was met more effectively this year
than last.
4) Assessing Assessment - The College of Science and Engineering (COSE) Assessment
Committee developed a rubric that it is using to assess assessment. The rubric is based upon
the SCSU annual report template. The committee will provide feedback and suggestions to
programs in the college about their assessment plans and reports. This is a pilot program
that will expand to other colleges within the next year or two.
5) Assessment Grants - Based upon the reports submitted by the recipients, each of the 14
assessment grants that were awarded in 2007-08 had a positive impact on program
assessment. Appendix C summarizes the activities and likely impacts of each of the grants.
An important reason for the success of the assessment grant program is that the selection
criteria and reporting guidelines are fairly specific. Recipients of five of the 2007-08 grants
reported on the results of their projects in a panel session during Convocation Week in
August 2008. Recipients of four more of the grants presented in a panel session during
January 2009 Workshop Days. Assessment grants have proven to be a relatively inexpensive
way to promote quality assessment work.
6) Website - The Assessment website has been reorganized into a more user-friendly format. It
now includes an online handbook that explains assessment expectations on campus and
provides tips on how to fulfill them.
7) Walvoord Visit - The Assessment Steering Committee collaborated with the Center for
Excellence in Teaching and Learning to bring Barbara Walvoord to SCSU for two days in
January 2008. She presented workshops on program assessment, general education, and
grading. The combined attendance at these workshops was over 200 faculty and staff. The
workshops received high evaluation scores and, anecdotally, favorably changed some
attendees' opinions of assessment. Copies of Walvoord's Assessment Clear and Simple were
provided to each attendee, and we have distributed additional copies of this book through
the Assessment Peer Consulting program and the Advancing Program Assessment through
Discussion Program.
8) APAD - Based upon the positive response to the Walvoord visit, the Assessment Steering
Committee established a new program, Advancing Program Assessment through Discussion
(APAD), which is designed to encourage discussions about assessment. This program
subsidizes the purchase of copies of Walvoord’s book, or an alternative book selected by the
program, to be used as a focus for program-level discussions of assessment.
9) Peer Consultants - A curriculum to train assessment peer consultants was developed and
implemented. An Assessment Academy mentor visited SCSU in May 2007 to facilitate a
“train the trainers” workshop that was attended by 20 persons representing key
constituencies in the institution. Six participants, who also were members of the Assessment
Steering Committee, designed the curriculum. Two cohorts of peer consultants have been
trained: 21 in January 2008 and 16 in October 2008. At the completion of the 11 hours of
required training, over 90% of the trainees reported that they were somewhat or very
confident in their abilities to articulate the basic principles of assessment, to present the
SCSU model of assessment, and to facilitate the assessment process in departments,
programs, and units.
An Assessment Peer Consulting Program was implemented. So far the program has served
five programs, one of them twice. Although the number of programs served is still small,
those using the service have been pleased with the results. Perhaps the most important
impact of the Assessment Peer Consulting Program to date has been on institutional
assessment capacity. The program has improved the assessment and consulting skills of the
peer consultants; it has improved assessment practices in the units that have received peer
consulting; and it has facilitated communication among faculty and staff members who have
an interest in assessment. Although new, the Assessment Peer Consulting Program has
begun to receive some national recognition. The core trainers for the Assessment Peer
Consulting Program will present a workshop on “Assessment Peer Consulting as an
Approach to Building Assessment Capacity” at the 2009 Assessment Academy Learning
Exchange and Showcase, which is part of the HLC annual meeting. A paper with the same
title will be published in the Collected Papers for the HLC annual meeting. In addition, a
reference to the program will appear in a book tentatively titled Principles and Profiles of
Good Practice in Assessment, which will be published in 2009.
10) Workshops - As in previous years, members of the Assessment Steering Committee offered
several workshops on assessment-related topics each semester. These workshops provide a
resource to departments and units that want to improve their assessment practices. Equally
important, they provide an opportunity to discuss assessment across department and unit
lines. Most sessions have been well attended and well received.
11) Assessment Luncheon - For the past several years the Assessment Steering Committee has
hosted an Assessment Luncheon. Members of assessment committees, assessment peer
consultants, department chairs, and administrators come together to talk about
assessment. Attendees are seated at round tables and provided discussion questions
pertaining to issues about which the Assessment Steering Committee would like feedback.
Other Activities
1) Annual assessment reports - Reports from programs remained at the college level in 2008,
while in 2007, the University Assessment Director received a copy. This change was
implemented in response to concerns among some faculty members that information in the
reports might be misused. Reports have always been intended for use by programs.
Institutional needs can be met through extraction of information from these reports, so
there is no need for the reports to be collected at the institutional level.
2) Deadline - Based upon input from faculty and college assessment directors, the deadline for
annual assessment reports was changed from April to September.
Next Steps
1) Improving Reporting - 2009-2010 is the last year of SCSU's participation in the HLC
Assessment Academy. In order to approach the goals for our primary Academy student
learning project, the rate at which programs submit annual assessment reports must increase
substantially. Free-standing minors and BES programs may require special attention, due to
their low reporting rates in 2008.
2) Feedback to Departments - The pilot program in COSE to provide feedback to departments
on assessment plans and reports will be expanded to other colleges.
3) Newsletter - The Assessment Office recently began publication of a newsletter. Members of
the Assessment Steering Committee will play a major role in identifying useful content for
future issues. Some articles will spotlight SCSU programs and will be written by faculty and
staff members from those programs. It is hoped that the newsletter will stimulate
discussions of assessment across unit lines, as well as raise the profile of assessment in the
institution.
4) Institutional Learning Outcomes - In the spring of 2008, the academic action planning work
group on Institutional Outcomes recommended that SCSU establish institutional learning
outcomes consistent with those in the AAC&U’s Liberal Education and America’s Promise
(LEAP) project. A similar recommendation is included in the Academic Action Plan
Framework that resulted from the academic action planning process. The Assessment
Steering Committee supports this recommendation and will consider actions that will
contribute to its implementation.
5) CLA in the Classroom - In the spring of 2008, the Assessment Steering Committee and the
Office of Institutional Effectiveness began preparations to offer CLA in the Classroom
training at SCSU. This training, which will be provided by employees of CLA (Collegiate
Learning Assessment), is tentatively scheduled for May 2009. SCSU is using the CLA as part
of the Voluntary System of Accountability. The CLA is based upon written responses to
performance tasks that focus on real-world scenarios in which students must evaluate
multiple sources of evidence. CLA in the Classroom trains faculty members in the use of a
retired CLA performance task and scoring rubric and assists them in developing similar
materials for use in their own classes. Those who complete CLA in the Classroom training
are authorized to train others at their own institution at no cost, other than a nominal
charge for materials provided by CLA. Use of CLA in the Classroom has a substantial
potential to increase interest in and conversations about assessment, especially if a large
number of faculty members participate.
6) Recognition and Rewards – In the spring of 2008, the Assessment Steering Committee
began working on a position paper on rewards for assessment work. The position paper will
examine the reward structure for assessment work done by faculty and staff and clarify
how and under what conditions assessment work at the program level contributes to
evaluation criteria specified in their collective bargaining agreements.
7) On-line Instruction - The Assessment Steering Committee so far has devoted little attention
to how the online and on-campus components of programs are integrated in program
assessment efforts. SCSU recently subscribed to Quality Matters, and a number of faculty
members have been trained to use the Quality Matters rubric to evaluate and improve
online courses. This is a useful resource at the course level, but it is not clear how this will
impact assessment at the program level.
8) Expanding Assessment - Learning Resources and Technology Services, Graduate Studies,
Undergraduate Studies, Continuing Studies, and Student Life and Development all are
represented on the Assessment Steering Committee and currently have assessment efforts
underway. However, some units within Academic Affairs have been relatively isolated from
assessment networks and, as a result, have less well developed assessment practices. These
units should be integrated more effectively into the institutional assessment structure.
Academy for the Assessment of Student Learning
SCSU was one of 13 institutions that were accepted into the first cohort of the Higher Learning
Commission’s Academy for the Assessment of Student Learning in 2006. Participation in the
Academy focuses on completion of student learning projects, which are developed over a four-year period. An Academy "team" from the institution, with the assistance of Academy mentors
and feedback from cohort institutions, defines and plans the implementation of the project.
Project descriptions are placed on the “Electronic Network” website. The Academy team
submits updates about once per year. After each update, mentors provide feedback on
progress, and representatives from other institutions also may provide feedback. An Assessment
Academy Learning Exchange and Showcase at the HLC annual meeting provides an opportunity
for institutions to learn from each other through presentations on effective practices that have
emerged from Academy projects.
SCSU Student Learning Projects
SCSU has three student learning projects: University Assessment System, Assessment of Student
Learning in Programs, and General Education Assessment. The focus of SCSU’s efforts so far has
been on the first two of these projects, which are closely related. The University Assessment
System project seeks to improve and standardize the assessment reporting process and to use
reports to improve assessment practices. Assessment of Student Learning in Programs
implements the system plan at the level of programs. Each major and graduate program should
have an assessment plan and should document in an annual report how that plan is being
implemented. Assessment reports should “close the loop,” or in other words, demonstrate how
assessment findings are being used to improve student learning. SCSU has designated
Assessment of Student Learning in Programs as its primary student learning project.
The General Education Assessment project essentially has been dormant. The assessment plan
for the new General Education program was approved in 2006, but implementation of the plan
is not possible until more progress is made in approving the new curriculum.
Appendix D includes a list of goals and tasks for each of these projects, as well as a description
of how much progress has been made in completing these tasks. Although the General
Education Assessment project is behind schedule, Table D shows that the other two projects
have resulted in substantial improvements over the first two years of Academy participation.
According to Jodi Cressman, the HLC Assessment Academy mentor who reviewed SCSU’s
primary student learning project in December 2008, “This project seems to reflect real progress
in assessing learning in the majors and engaging faculty in assessment.”
Recent Activities Related to Student Learning Projects
This section summarizes how activities described in previous sections of this report support the
goals of SCSU’s Assessment Academy student learning projects. For a complete list of student
learning project goals, see Appendix D.
University Assessment System
Goal: Communicate institutional assessment policies
Activities: The reorganized Assessment website makes it easier for faculty and staff
members to find information about institutional policies, as well as to identify internal
and external resources that may be useful to their programs. A major goal of the
Assessment newsletter is to foster communication across unit lines by highlighting
assessment activities in specific programs that may be of general interest.
Goal: Build institutional assessment capacity
Activities: Many activities support this goal. Assessment grants and CLA in the
Classroom promote assessment capacity by directly involving faculty members in
assessment projects. The COSE Assessment Committee and soon the other college
assessment committees will review program assessment plans and reports and provide
feedback to departments. This process will build assessment capacity in both the
committees and in departments. The Assessment Peer Consulting Program directly
improves the assessment expertise of consultants through the training process. In
working with programs, consultants improve the assessment expertise of program
faculty and staff. This program also fosters assessment capacity by providing
opportunities for faculty and staff members to communicate about assessment across
disciplinary and unit boundaries. CETL workshops, the Assessment Luncheon,
institutional learning outcomes, and CLA in the Classroom also provide opportunities for
such discussions. The visit by Barbara Walvoord and the Advancing Program Assessment
through Discussion program promote discussions about assessment, but primarily at the
program, department, or unit level. The reorganized Assessment website makes it easier
for faculty and staff members to identify online resources to assist their program
assessment efforts. The Assessment Office also maintains a small assessment library
that is available to SCSU faculty and staff.
Goal: Integrate assessment into work life of institution
Activities: The Assessment Steering Committee has collaborated with the Center for
Excellence in Teaching and Learning to bring Barbara Walvoord to SCSU and to offer
assessment workshops on faculty development days. The Committee also is preparing a
position paper that will help to clarify where assessment work fits in the institutional
reward system. In addition, the Committee supports future initiatives to implement
institutional learning outcomes and to revise program review guidelines so program
review aligns more closely with annual assessment reports.
Goal: Implement assessment of assessment
Activities: The College of Science and Engineering Assessment Committee will review
2008 assessment reports from programs in that college and provide feedback to
departments on assessment plans and reports. This is a pilot project. Next year other
college committees will implement a similar review process.
Assessment of Student Learning in Programs
Goal: Implement institutional assessment plan at level of major programs
Activities: This is accomplished largely through the college assessment directors, who
work one-on-one or through their committees to assist programs with completion of
their assessment plans. The Assessment Office also offers several resources to assist
programs in completing or revising their assessment plans: assessment grants,
assessment peer consulting, and Advancing Program Assessment through Discussion.
Goal: Implement institutional assessment reporting system at program level
Activities: The college assessment directors, through their committees and one-on-one
exchanges with faculty members, play a central role in assuring that programs submit
annual reports and that the reports contain the information needed. One important
aspect of this role is to assist programs in coordinating requirements of accrediting
bodies with institutional reporting guidelines. The Colleges of Education and Business
made substantial progress with this in 2008, while the assessment directors in the other
colleges have successfully dealt with this challenge on a smaller scale. The feedback that
college committees will be providing to departments on their assessment reports will
help to improve the quality of those reports. Likewise, programs can use resources such
as assessment grants, assessment peer consulting, and Advancing Program Assessment
through Discussion to improve their assessment practices.
Appendix A. Reports from the College Assessment
Directors
College of Education by Elaine Ackerman
College of Fine Arts and Humanities by Wendy Bjorklund
College of Science and Engineering by Maria Womack
College of Social Sciences by Joseph Melcher
Herberger College of Business by Carol Gaumnitz
College of Education 2007-08 Assessment Report
Prepared by Elaine Ackerman, COE Assessment Director
Submitted October 1, 2008
Introduction/Background
The College of Education (COE) prepares future teachers, administrators, school
counselors, and other professional personnel at both the undergraduate and graduate
levels. Through the 63 programs within eight different academic units, students enjoy
many opportunities to pursue a variety of career paths in professional education and
service-related fields. Of these programs, approximately 95% are state or nationally
accredited. Slightly over 50% of programs in the COE are teacher licensure programs.
Thus, these 33 programs are dually accredited by the Minnesota Board of Teaching
(BOT) and the National Council for Accreditation of Teacher Education (NCATE). Nine
additional programs licensing other school professionals are also accredited by NCATE.
Other entities accrediting programs in the COE include the Council for Accreditation of
Counseling and Related Educational Programs (CACREP), the Council for Professional
Education (COPE), the National Association for Sport and Physical Education (NASPE),
and Behavior Analyst Certification Board (BACB).
Description of the State of Assessment in the COE
The COE Assessment System was designed and implemented by the Assessment
Committee, Assessment Director, Associate Dean, and the Dean of COE. The
Assessment System can be found on the NCATE link within the COE website. Each of
the eight units in the COE is represented on the Assessment Committee.
Much of the work completed by the Assessment Committee throughout the 2007-08
academic year revolved around the preparation for the Board of Teaching (BOT) and
National Council for the Accreditation of Teacher Education (NCATE) reaccreditation
visits in April 2008. In addition, two programs completed the process for continued
accreditation with their respective professional organizations and two programs
successfully completed the process for initial accreditation. Within the Department of
Counselor Education and Educational Psychology (CEEP), both the School Counseling
and the College Counseling and Student Development Programs successfully completed
the process for continuing accreditation from CACREP. Within the Department of
Education Leadership and Community Counseling (ELCP), the Community Counseling
Program was awarded initial accreditation from CACREP. Also, within the ELCP
department, the Marriage and Family Therapy Program was awarded initial accreditation
from the Commission on the Accreditation for Marriage and Family Therapy (CAMFT).
Those programs accredited by NCATE and most others operate on the following
principles: First, key personnel align state and national standards with programmatic
activities and formal assessments including traditional coursework and field and clinical
experiences. These alignment matrices include the coordination between knowledge,
skills, and dispositions, on the one hand, with assessment procedures on the other.
Another key element of the Assessment System is the establishment of transition points
within each program whereby students must demonstrate that they are making adequate
progress to move forward within programs. For example, students are not allowed to enter
a clinical experience unless they can demonstrate the necessary knowledge, skills, and
dispositions required for field work.
As part of accreditation processes in the College, considerable effort was expended over
the past three years on formalizing the assessment and reporting process. To this end, a
set of follow-up studies was conducted, analyzed, and disseminated. These reports are
disaggregated by program and all include assessment of student learning outcomes,
especially the "performance-based" investigation. The following data are collected on a
systematic basis:
● Candidate self report: A follow-up study collected each semester and disseminated every other year.
● Cooperating teacher study: Collected each semester, reported bi-yearly.
● Performance-based assessment: Data on clinical experiences are collected annually and disseminated every other year.
● Two-to-five year follow-up study: A random survey of candidates who have completed degrees and who have been gone from the institution for a period of years.
● Unit operations assessment: This instrument allows candidates to assess the operations of the unit, that is, the unit's performance in support of student learning outcomes. Data are collected annually.
● PRAXIS Data: All program completers in the education unit complete national exams related to basic academic skills, content, content pedagogy and pedagogy. These data are collected and disseminated annually.
One of the aspects of our Assessment System in which we take great pride is the system
for documenting that departments are utilizing the data for reforming programs. When a
report is disseminated to a department or program, department chairs or coordinators are
asked to complete a form reporting how they used the data to make changes in the
program, unit, or curriculum. On an annual basis, we summarize the returns of the
instrument in a report.
The COE assessment goals for the 2008-09 academic year are:
● Work with department chairs and assessment coordinators to encourage the pursuit of national recognition for their programs. This effort will be coordinated by Kate Steffens, COE Dean, and Elaine Ackerman, Assessment Director. The goal for this academic year is the submission of one program application for national recognition.
● Design and implement a plan for creating a qualitative component within the existing Assessment System. As part of the unit-wide assessment practices, a number of instruments have been developed and data collected, analyzed, disaggregated, and disseminated to programs. The instrument most directly measuring student learning outcomes (SLOs) is the performance-based instrument administered to cooperating teachers and university supervisors. Since these methods are currently designed around quantitative elements, the Assessment Committee has proposed a qualitative component to enrich and deepen the data now collected. This year, the plan will be designed and implemented.
● Develop a method for accumulating and tracking performance data, such as teacher work samples, portfolio outcomes, and various other examples of student learning outcomes at the undergraduate level.
● In collaboration with the COE graduate coordinators, design and implement a graduate database to track candidate performance related to learning outcomes.
Analysis of Progress
As indicated in Part I of the COE University Assessment Grid, most departments
have submitted an assessment report. All reporting programs utilized the form/template
recommended by the University Assessment Steering Committee. This in itself was a
tremendous improvement over last year’s reporting.
Most programs have identified student learning outcomes as an integral part of their
assessment plan. All programs have indicated and in most cases, provided evidence of
both direct and indirect measures of student learning outcomes. As also indicated on the
COE University Assessment Grid, many programs use other measures to assess program
effectiveness and progress such as end-of-program satisfaction surveys and employer
surveys.
Regarding tangible data on student learning outcome measures, as noted on Part II of the
COE Program Matrix, several programs have not yet collected data on their student
learning outcome measures, or the data have not yet been organized and analyzed in a
manner that allows tracking of individual student performance. The Dean
of the College is making this work a priority and will ensure that all programs are
collecting data on student learning outcomes. Much progress has been demonstrated and
programs are taking steps to move forward with their work. For example, faculty
members within the Community Psychology Program, one of the largest undergraduate
majors at SCSU, have just formalized the student learning outcomes for their program.
Thus, as indicated on the matrix, they will begin their data collection this year.
According to Part II of the University Assessment Report Matrix, all of the reporting
programs indicate that they have conducted program discussions regarding assessment on
at least an informal basis. Most programs have discussed assessment issues as an agenda
item in regular department meetings. Through the work of the Assessment Committee,
this process will become more formalized.
The COE Assessment Matrix reveals that a preponderance of programs are using data
regarding student learning outcomes to make improvements. Listed below are several
examples of program goals identified on the basis of data collected in the College
(programs not named specifically).
● The program coordinators, graduate coordinators, and department chair
will be responsible for seeing that the data collected in courses at the
individual level (identified via the extant portfolio document), will be
collected by instructors, cumulated, and reported annually.
● The program will engage in an alignment process in the following areas:
○ Alignment of program outcomes with program description and mission
○ Course syllabi and program outcomes
○ Course content across sections of the same course
○ Course content within program outcomes.
● The program plans to conduct a review of where in the curriculum each
competency is assessed. When that is complete, data will be gathered on
each competency. For example, Competency #1—“Knowledge of what
leadership is, how it has been distinguished from administration, and the
ability to develop a practical and personally useful definition of
leadership”. The assessment will occur across the program curriculum—
featuring alignment of competencies with course outcomes and course
/student assessments.
Note From the Assessment Director: As the COE Assessment Director, I would like to
see programs in the College of Education routinely use assessment and data as key
components in determining instructional strategies to enhance teaching effectiveness to
ensure that students gain the necessary knowledge, skills, and dispositions to be
successful. We are making strides toward that goal as is evident from the University
Assessment Grid and this summary report.
College of Fine Arts and Humanities 2007-2008
Assessment Report
Prepared by: Wendy Bjorklund, College Assessment Coordinator
Submitted October 1, 2008
During the past school year, assessment activity increased in the College of Fine
Arts and Humanities. I received year-end assessment reports from every department in
the college. Not every program in every department was assessed, but each department
assessed some aspect of student learning in at least one program. This represents a
substantial commitment to the work of assessment. Often that work is accomplished
through the efforts of a single individual within a department, with few accompanying
rewards. In other cases, departments have entire committees devoted to gathering and
analyzing assessment data.
Several of the departments in the college have accrediting bodies that dictate their
assessment practices. It is not always possible to align perfectly the assessment necessary
for accreditation with the assessment of student learning as defined by St. Cloud State
University. That said, based on what has been reported this year, department assessment
coordinators have been quite successful in drawing relevant assessment data from their
accreditation processes and connecting that data to student learning. In only a few cases
are student learning outcomes more teacher-centered than focused on what students will
learn in a program, and only a handful of measures in use are so indirect that it is difficult
to tell whether they are measuring the intended outcome.
Most of the departments’ assessment efforts have focused on established student
learning outcomes for their departments. In a few cases, however, the focus has been on
establishing better assessment practices. For example, student learning outcomes and
measures have been created for the Music MA program, so this year assessment can be
conducted in that program. For the past year, Music also has focused on improving
advising processes for their students, which are sure to promote students’ success in their
programs. These types of improvement, though difficult to connect directly to student
learning outcomes, are important to recognize.
Eight of the nine departments used the template devised by the Assessment
Steering Committee for all or most of their year-end reports. In those instances when a
department elected not to use the template, it was not clear whether assessment results
had been discussed within the department or whether any changes would be made to the
department’s curriculum, pedagogy or assessment practices based on those results. Of
those departments that discussed their assessment results, most report proposed changes
based on the data. These changes are evenly divided between curriculum/pedagogy and
assessment practices. Assessment discussions occur both formally and informally, often
in multiple venues. It is encouraging that most departments seem to be talking about
assessment, sharing data and making improvements to their programs or practices as a
result.
One of the pieces of information that is difficult to derive from the reports is
whether departments have implemented any proposed changes to date, and whether they
have collected data that demonstrate whether student learning has improved as a result.
This may reflect a problem with the template and how it requests that information. In
only a few instances has a department reported a change to a program or assessment
practice based on assessment data, and no new data collection assessing the effects of
such a change has been reported.
Most departments report using direct measures for assessing student learning
outcomes, while a few report using a combination of both direct and indirect measures.
Just two programs use only indirect measures of student learning. This increases the
validity of the college’s assessment efforts as direct measures of student learning are
more likely to measure what they are intended to measure. The indirect measures,
however, also provide insight into student learning and are useful.
Over half of the departments conducted and reported on upper-division writing (UDW)
assessment. Several of these departments are making changes to the rubrics used for UDW
assessment and/or to the manner in which they have been using those rubrics. I anticipate
that the percentage of programs reporting on UDW assessment processes will increase each
year. I also received assessment reports on courses that appear in the general education
program from over half of the departments in the college. Given that the current general
education program is being reconfigured, I was surprised and pleased to see the
commitment demonstrated to assessing the student learning in this program. By contrast,
assessment was conducted in only about 25% of the programs offering a BES degree.
This presents an opportunity for improvement over the next year.
Overall, the assessment reports submitted for 2007/2008 are excellent. Most
departments have assessment plans in place for their programs. They may not conduct
assessment in every program every year, but every year, they conduct assessment of
student learning in at least one program. In addition, there has been an increase in the
assessment of student learning in UDW courses, and in the courses that are part of the
general education program. Faculty members are discussing assessment data, and
improvements are being made to programs and assessment practices as a result of those
conversations. I would like to see more assessment being conducted after these changes
are implemented in order to test their effectiveness. I also would like to see assessment
more uniformly supported across the departments of the college. In some cases,
individual faculty members have become pretty overwhelmed and frustrated by this
work, despite the fact that much of it is “good work.”
As I assembled this report, I also realized that the assessment plans for the various
programs in the College of Fine Arts and Humanities posted on the assessment website
are incomplete. This is not because the plans have not been completed (I am receiving
assessment reports based on those plans!), but because they somehow did not reach the
people who can get them posted and updated. One of my goals for this school year is to
rectify that problem.
College of Science and Engineering 2007-08 Assessment Report
Prepared by Maria Womack, COSE Assessment Director
Submitted October 1, 2008
The COSE Assessment Committee is a dedicated group of individuals,
representing the twelve departments in the college. Each spring this committee has set
goals for the coming academic year, and each year the goals have been met.
The goals for the 2007-08 COSE Assessment Committee were:
1. Complete program assessment plans for all departments
2. Interpret and disseminate COSE engagement survey results
3. Monitor general education assessment
4. Continue to close the loops
5. Develop COSE grid for tracking department assessment progress
6. Streamline electronic report submission procedures
The COSE assessment effort has had full support of the Dean’s office, and the COSE
Assessment Committee has aligned all departmental missions with the system, university
and college missions. In addition, the committee has reviewed and posted all
departmental goals and student learning outcomes for all undergraduate majors, and
begun posting matrices and timelines online. See:
http://www.stcloudstate.edu/cose/college/assessment/default.asp
Goals for the 2008-09 COSE Assessment Committee were approved by the committee at
the end of spring semester 2008:
1. Continue to close the loops
2. Monitor general education assessment
3. Implement COSE grid for tracking assessment progress
4. Implement electronic report submission procedures
5. Assess student learning in service courses
The COSE departments continue to vary in their stages of assessment, but with less of a
spread than in previous years, as all departments now have assessable missions and student
learning outcomes, and most have timelines and matrices worked out. This is a big
improvement over several years ago, when several departments still did not have
assessable SLOs for their major programs. An overview of the college's assessment
programs is below.
1. The Department of Nursing Science built assessment of student learning into
its program design at every stage, and assesses all SLOs every year. As
evidenced in their current report, student learning in each course is
documented and assessed. Alumni survey results are given for 3 years.
Furthermore, activities across the program are assessed with common rubrics,
and decisions are made on the basis of data gathered.
2. Three departments have their programs accredited by ABET, the Accreditation
Board for Engineering and Technology. They are Computer Science,
Electrical and Computer Engineering, and Mechanical and Manufacturing
Engineering. These programs have detailed assessment plans and have all
been "around the assessment loop" numerous times. Their accrediting agency
defines the format of their student learning outcomes, departmental goals,
and annual reports. They continue to make program improvements based on
assessment data. This past year they separated Computer Engineering (CE) from
Electrical Engineering (EE) assessment, per ABET. Electrical Engineering received
reaccreditation, and Computer Engineering is newly accredited until 2010.
3. The Department of Environmental and Technological Studies has a long
assessment history as well as nationally accredited programs. This
department has well defined assessment plans for its majors and continues to
make improvements based on assessment data.
4. The Aviation Department is re-writing its assessment plan based, in part, on
requirements of its accrediting agency. They are in the process of
implementing a newly coordinated system for assessing student learning.
5. Six departments - the Departments of Biological Sciences, Chemistry, Earth
and Atmospheric Sciences, Environmental and Technological Studies,
Mathematics, and Physics, Astronomy and Engineering Sciences – have
education majors that must meet the standards of NCATE, the National
Council for the Accreditation of Teacher Education, as well as the
requirements of the Minnesota Board of Teaching. All were up for, and
received, accreditation from NCATE. It has been a challenge to incorporate these
standards and requirements into the college assessment of student learning
report. The COSE Assessment Committee continues to improve assessment
for those majors without unnecessary duplication of effort.
6. The other major programs in five of these departments - the Departments of
Biological Sciences, Chemistry, Earth and Atmospheric Sciences,
Mathematics, and Physics, Astronomy and Engineering Sciences – as well as
the major programs in the Department of Statistics & Computer Networking
Applications, are in the earlier stages of assessment. All are gathering data
related to some student learning outcome and have made some improvements
based on that data. Results from a survey of graduates were used by the
C.N.A. department to introduce more programming into the courses and make
the classroom more interactive with more hands-on activities. EAS did a
major overhaul of their UDWR, tested it and modified the plan. Physics
expanded the assessment plans for the RadTech and NucMedTech programs,
introducing new examination and grading techniques and hiring a new
director for that program.
7. It’s worth noting that the majority of Assessment grants awarded went to
faculty in the College of Science and Engineering. These grants focused on
implementing assessment programs.
A breakdown of the reporting details received from departments shows that while
all departments submitted an assessment report for their majors, none submitted
any reports for free-standing minors. There is still confusion about the need to
submit those in the first place, which the college committee will address next
year. Half of the departments submitted an assessment report for their Upper
Division Writing Requirement, but several departments noted that they do not
assess it every year. General education assessment reports were received from
27% of the departments, with some hesitation about what to use for SLOs because
the general education assessment programs were being reviewed at the university
level last year. Most departments indicated that they would assess general ed
again once new standards were approved and in place.
| Program | Departments submitting at least 1 report | Percentage of departments submitting at least 1 report |
| --- | --- | --- |
| Majors | 12 | 100% |
| Free-standing minors | 0 | 0% |
| Master's | 1 | 14% |
| General ed | 3 | 27% |
| UDWR | 6 | 50% |
New templates were suggested for use with the assessment reports, and 10 of 12
departments used them.
A pilot program known as "Tracking Assessment" was begun. The committee
discussed how this might proceed and drafted forms to be used for next year. The
forms will be used by a subcommittee of the COSE Assessment Committee to
review submitted reports, and recommend improvements to the reporting process
and/or the assessment plan. This will be implemented in 2008-09 and COSE will
be the first college at SCSU to do this.
An overview of materials submitted from departments is given in two grids
below. As noted above, some trends for the college are:
● 100% of departments submitted at least one report. This appears to be a minor improvement over last year's already high submission rate.
● 0% of departments submitted reports for free-standing minors.
● 50% of departments submitted reports for UDWR (but this is not assessed every year).
● 27% of departments submitted reports for general ed; many departments stated that they were waiting until general ed SLOs were rewritten and approved before they assessed it in their department.
● ~100% of all assessment reported was on the SLOs that are part of the department's assessment plan.
● All departments used direct measures of assessment, and 5 of 12 departments used indirect measures, such as surveys of current or former students.
● Findings were reported for ~50% of the SLOs measured. Many departments either are still collecting data and are not yet at the conclusion stage, or claimed to have findings but did not give the details in their reports.
College of Social Sciences 2007-08 Assessment Report
By Joe Melcher, COSS Assessment Coordinator
Submitted October 1, 2008
During AY 2007-2008 COSS programs made significant additional progress
toward developing systematic assessment. Almost all COSS programs have at least parts
of their program assessment plans in place. This was our best year ever for receiving
annual program assessment reports: 10 of 13 departments submitted reports based on
assessment in at least one program. Twenty-four of 56 (43%) individual programs
submitted reports (up from only one in 2005-06), and all used the ASC-recommended
program report template. The vast majority of non-submitting programs involve BES
degrees (see below).
Assessment Strengths in 2007-2008
Progress in developing assessment plans
COSS programs made significant additional progress in completing assessment plans and
posting them to the University Assessment Office website. Only two regular bachelor's
programs have not yet formulated Missions. Both of these relate to the Social Science
degree programs, which are undergoing significant reorganization in connection with a
new permanent director and because of their special relationship to the College of
Education and NCATE accreditation. The only other programs without Mission
statements are BES programs linked to regular bachelors programs. COSS also made
great progress in posting the program matrix and timeline components of assessment
plans, with completion rates rising from 35% to 57% and from 17% to 47%, respectively.
Progress on doing assessment and submitting assessment reports
In addition to the improvement in the number of programs submitting reports, the
quality of assessment measures also continued to improve. Only two programs reported
using only indirect measures; all others used direct measures.
The number of graduate programs submitting reports increased from 2 to 5 (out of
11).
Several programs (most notably Sociology, Social Work, Gerontology, and Psychology) did
strong program and/or course assessment that focused well on their student learning
outcomes using a variety of direct measures. Psychology also included results from a
nationally normed test taken by graduating seniors and results from a recently developed
graduating senior exit survey. They also included some first-pass comparisons of student
learning outcomes between on-ground and online courses.
Almost all programs that submitted assessment reports also indicated at least
some discussion of the results.
Upper division writing requirement assessment
The number of programs submitting UDWR assessment reports rose from one last
year to four this year.
Assessment Weaknesses in 2007-2008
Assessment plans
Despite clear progress, 32 programs still need to complete assessment plans. This will
be a priority during 2008-2009. Although this number seems high, it includes 8
minor-only and 5 BES programs. Not counting these, 12 regular bachelor's programs have
incomplete assessment plans.
Assessment of BES programs has been problematic simply because none of them have been
assessed. The (very understandable) problem is that nobody wants to assess BES programs
separately, particularly because they typically have no more than a handful of students
per year (and sometimes none). Several programs have solved the problem by eliminating
the BES option, but at least a couple are reluctant to do so, despite the small number
of students. The simplest solution would be to eliminate the requirement that BES
programs be assessed separately, based on the assumption that BES students mostly take
the same courses as regular majors (or minors, in the case of the Ethnic Studies BES
options).
Doing assessment
Although there was good improvement this year, both in the number and the quality of
assessments, a number of programs still did not submit reports. And even within programs
in which some assessment is taking place, there are typically many faculty who have not
contributed. As one way to encourage more faculty to participate, I have spoken with
Dean Harrold about the desirability of encouraging faculty to submit direct measures of
assessment as evidence of teaching effectiveness during the EPT process. He seemed
supportive, and I will encourage him to follow up on that as he meets with faculty
during the Article 22/25 process. Ideally, direct assessment of student learning,
combined with actions based on results, will eventually predominate over the old method
of using student course evaluations. Department chairs and EPT committees should also be
encouraged to emphasize that course assessments based on student learning outcomes are
more beneficial than student course evaluations.
Although there was a solid increase in the number of UDWR assessment reports, more
programs clearly need to begin submitting them.
Assessment quality
The quality of course and program assessment ranged from excellent to marginal.
The most marginal assessments were based solely on a single indirect measure of
students' opinions about their learning. Some programs/faculty clearly need more
guidance and assistance in selecting direct measures to assess program learning
outcomes. Now that these programs are clearly identified, I will encourage them to take
advantage of the Peer Consulting program.
Closing the loop
Another area that needs work is documenting the implementation of changes based upon
the results of assessments. None of the 19 reporting programs were able to document
changes based upon assessment results at either the course or program level (although
some indicated that changes are planned). Similarly, no programs were able to report
having collected data on the last round of changes (if any). I think that part of the
reason for this is sheer lack of time and the logistical difficulty of going back to
collect data on previously assessed learning outcomes while simultaneously gearing up
to assess the next set of outcomes in the schedule. This is an issue that the
Assessment Steering Committee probably needs to discuss.
COSS Assessment Goals for 2008 – 2009
* 100% of bachelor's programs will have complete assessment plans.
* 100% of BES programs will have complete assessment plans (or be eliminated).
* Improve the quality of assessment by helping faculty and/or programs do more direct,
rather than indirect assessment. Assessment Peer Consultants may be able to help
with this.
* Increase the number of programs submitting UDWR assessments.
* Work with the Geography and Ethnic Studies programs to help them finalize their
assessment plans and to do assessment work this year.
* Increase the number of faculty within programs who contribute to the program
assessment plan. In several cases, most of the effort seems to be coming from one
or two faculty members.
G. R. Herberger College of Business 2007-2008 Assessment Report
Prepared by Carol Gaumnitz, HCOB Assessment Director
Submitted October 1, 2008
Introduction
The G. R. Herberger College of Business (HCOB) has been accredited by AACSB for
over 30 years. In 2003, the AACSB accreditation standards moved from accrediting
business schools based on the strength of their resources and reputation to a new
mission-based philosophy. With student learning fundamental to all business schools,
assessment of student learning is critical for continuing accreditation. The major
difference between assessment of student learning for accreditation and for university
reporting is in the definition of "program." For AACSB accreditation, the college has
two programs, the bachelor's degree and the master's degree, while the university
requests reports for all major and free-standing minor programs.
The HCOB Assessment Steering Task Force (HCOB Task Force) and Assessment
Director Kerry Marrer directed the college’s assessment activities for 2007-2008. Since
Kerry is not a member of the faculty, Carol Gaumnitz was elected to represent the HCOB
on the University Assessment Steering Committee.
Assessment Activities for BS Program
The HCOB has five student learning goals for undergraduate business majors.
(1) Our students will be effective written and oral communicators.
(2) Our students will be competent problem solvers.
(3) Our students will be effective collaborators.
(4) Our students will be competent in the business core.
(5) Our students will be competent in their respective disciplines/majors.
Learning objectives were written for each goal, and a course matrix and assessment
timeline were prepared. These were reviewed and approved by all departments during
Fall Semester 2007. Competency in a major is assessed at the department level.
Departments developed and approved learning objectives (goals) for all majors. A course
matrix and assessment timeline were prepared for all but the entrepreneurship and
international business majors. These majors are currently being reviewed by the faculty.
Copies of the goals and objectives for the HCOB and its majors are attached to this
report.
The assessment activities conducted in 2007-2008 are summarized below by college goal.
To assess the last goal, competence in the student's major, each department was asked by
the HCOB Task Force to assess at least one of its objectives during Spring Semester
2008. The results of these and any other activities conducted by the departments are
discussed below by major.
Written Communication
During the 2007-2008 academic year, assessment of written communication was a major
focus. A common HCOB written communication rubric was adopted, except for the
content portion, which was customized for each major. Additional samples of student
writing were gathered from students across majors. An outside writing expert (with a
master’s degree in English) was employed for evaluation of writing mechanics. Results
from the review of 333 writing samples showed that 58% of the students’ writing was
fair, very good, or excellent. The remaining 42% were rated as poor or failing.
Additional writing samples have been evaluated but not yet summarized; results on these
samples are similar. Although the expert's evaluations are very tough, the HCOB Task
Force still found these results unacceptable. Recommendations for improving student
writing were formulated by the HCOB Task Force and forwarded to the Dean and Executive
Committee. Informational meetings for HCOB faculty were also held to solicit faculty
input on the recommendations.
Some of the recommendations will be implemented Fall Semester 2008. Short-term
changes include the addition of a revise/resubmit writing assignment in COB 111 using
the HCOB written communication rubric. This course is an introduction to the HCOB for
potential business majors. COB 111 will now include an emphasis on the importance of
writing for business majors. Another quick remedial action was the creation of
short punctuation and mechanics of writing guides. These guides will be made available
electronically to faculty in Fall Semester 2008. The guides may be put on D2L for
courses with writing assignments. It is hoped that these concise guides will help students
improve their own writing. Continuing support of an English expert to help faculty
evaluate student writing was requested and granted. Long-term curriculum changes
continue to be considered. Overall, the actions can be summarized as placing a greater
importance on writing in the HCOB.
Plans for implementing the Upper Division Writing Requirement (UDWR) were
discussed extensively by the HCOB Task Force. Members went back to their respective
departments for further discussions. The designated courses were chosen for most
majors. Several departments piloted the UDWR in their selected courses during Spring
Semester 2008. The HCOB Task Force decided to combine assessment of writing and
the UDWR. Papers from the designated courses were collected for assessment by the
outside writing expert. This was the source of most of the papers discussed above.
Problem Solving
Discussion took place on how to assess problem solving. The course in which problem
solving may be assessed has not been identified. Assessment of problem solving is still
in the early planning stage.
Collaboration
Assessment of collaboration was discussed during 2007-2008. Surveys were developed to
ask faculty and students about the extent of collaboration in HCOB courses. These
surveys will be administered at the beginning of Fall Semester 2008 and will help
establish a baseline of where collaboration is being used and of students' perception
of collaboration in their coursework. From this starting place, assessment activities
will be designed in 2008-2009.
Business Core
Assessment of competence in the business core was planned and implemented in 2007-2008.
The "ETS Major Field Test in Business" was selected as the means of assessing
competency in the business core. In addition to an overall score, results will be
provided by major topic. The ETS exam was administered during Spring Semester 2008 and
will be administered again during Summer Session 2008. When the summer results are
received in Fall Semester 2008, the results will be shared with the HCOB faculty. For
evaluation purposes, ETS provides data on the score distribution for all individuals
from all schools taking the exam, as well as institutional means and score
distributions. The exam has been given to over 37,000 individuals at 447 institutions.
At SCSU, the exam was administered to 171 students enrolled in MGMT 497, the HCOB
capstone course.
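As an illustration of how such distribution data can be used, the sketch below (Python) places a hypothetical institutional mean within a hypothetical national score distribution. Neither score list comes from the actual ETS results, which had not yet been fully received when this report was prepared.

    # Hypothetical illustration: locating an institutional mean within a
    # national distribution of exam scores. All values are made up; a real
    # comparison would use the distributions that ETS supplies.
    from statistics import mean
    from bisect import bisect_left

    institution_scores = [152, 148, 161, 140, 157]
    national_scores = sorted([138, 142, 145, 147, 149,
                              150, 153, 155, 159, 163])

    inst_mean = mean(institution_scores)
    rank = bisect_left(national_scores, inst_mean)
    percentile = 100 * rank / len(national_scores)
    print(f"Institutional mean {inst_mean:.1f} falls at about the "
          f"{percentile:.0f}th percentile of the national sample.")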
Accounting Major
During Fall Semester 2007, student learning objectives on course syllabi were reviewed
at a faculty meeting. Measurability was discussed, and a copy of Bloom’s taxonomy was
distributed. Suggestions were shared for improving course learning objectives. The
importance of including student learning objectives on the syllabi for all required courses
in the accounting major was emphasized. In addition, writing and ethics were assessed in
Spring Semester 2008. Business taxation and critical thinking are scheduled for
assessment next year.
The results from assessment of "ethical issues facing the profession" in ACCT 486
suggested that, when given an ethics case, students could identify the issues, affected
individuals, and possible actions. Students did not, however, score well when tested on
the content of the AICPA code of ethics for auditors. This suggests that more time
should be spent on the code of ethics in the auditing course.
Assessment of written communication was combined with a pilot of the UDWR. During
Fall Semester 2007, the department faculty discussed the UDWR and chose a course in
which to focus on professional correspondence. During Spring Semester 2008, case
papers were assigned in ACCT 382. The papers were also given to the college’s writing
expert for evaluation of writing mechanics. The papers were scored using the HCOB
written communication rubric. The rubric scores content, organization, expression, and
mechanics on a three-point scale; a score of 1, or "not good enough," in any one area
indicates a paper that is not acceptable. The results from papers collected from 71
accounting majors showed 73% scoring "good enough" or "better than good enough." Thus,
only 27% of the students had one or more scores of "not good enough." More emphasis will
be placed on writing style and mechanics next year.
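The rubric's decision rule, and the resulting share of acceptable papers, can be expressed compactly. The sketch below (Python) is not the HCOB instrument itself, and the sample scores are hypothetical; it simply encodes the stated rule that a 1 in any of the four areas makes a paper unacceptable.

    # Sketch of the stated rubric rule: four areas scored 1-3, and a paper
    # is acceptable only if no area scored 1 ("not good enough").
    AREAS = ("content", "organization", "expression", "mechanics")

    def acceptable(paper):
        return all(paper[area] >= 2 for area in AREAS)

    # Hypothetical sample of scored papers.
    papers = [
        {"content": 3, "organization": 2, "expression": 2, "mechanics": 2},
        {"content": 2, "organization": 1, "expression": 3, "mechanics": 2},
        {"content": 3, "organization": 3, "expression": 2, "mechanics": 1},
    ]
    share = sum(acceptable(p) for p in papers) / len(papers)
    print(f"{share:.0%} of papers scored 'good enough' or better in every area")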
Business Computer Information Systems Major
In BCIS 350, "apply system concepts for framing and understanding problems" was
assessed. A rubric had to be developed and reviewed for this objective, and questions
embedded in an exam were used for the assessment. The results will be discussed as an
agenda item at a department meeting in Fall Semester 2008. Assessment of written
communication and "computer programming for creating information systems applications"
is planned for next year. Discussions have already begun on how to assess the major
objective.
Finance Major
"Application of finance knowledge to real-world problems" was assessed during Spring
Semester 2008. Students were given an essay question that asked them to give a detailed
plan for investing retirement contributions. The results were scored from 1 to 3, with
3 being high; the averages were 2.56 and 2.65 in two sections of FIRE 373.
These results were shared through informal conversations among faculty members.
Each of the three majors in this department will include an assessment activity in Fall
Semester 2008. "Valuation techniques" will be assessed in the finance major,
“conducting effective research in insurance related areas” will be assessed in the
insurance major, and a “real estate investment analysis” will be assessed in the real estate
major.
Management Major
“Identify and define human resource activities and their role in organizations” was the
goal assessed during 2007-2008. Two faculty members collaborated to design an
instrument to assess student understanding of various core concepts in human resources.
The instrument was given to 73 students in two sections of MGMT 352. In general,
students received their highest ratings on their knowledge of employment law. Scores on
workplace safety, unions/labor relations, job analysis, and global human resources were
all within acceptable ranges. Students had unacceptable scores in their
understanding of the current market pricing of jobs. The investigators discussed
improvements that can be made where student understanding was weak. In particular,
they agreed that further efforts need to be made to aid student comprehension of job
pricing and the current constraints on setting wage and salary grades. Wage and salary
surveys will be sought out for use in class.
For 2008-2009, the management department wants to close the loop in MGMT 352 by
making the suggested changes. The department will also plan and administer assessment
of “management practices across cultures and countries” and the “structure, processes,
and outcomes of organizations.”
Marketing Major
Assessment of “delivery of professional presentation” was conducted during Spring
Semester 2008. Each student in MKTG 415 was required to create, refine, and present a
professional sales presentation. The presentations were videotaped and evaluated. Each
student was given feedback on his/her performance in the form of a scored rubric. All
students passed with scores varying between 75% and 100% of the available points for
the project. Other faculty members who have taught, or are currently teaching, sales or
sales management were given a draft of the rubric for suggestions and improvements. It
was suggested that the process could be improved by making each student’s videotaped
presentation available for his/her review.
Assessment in the marketing major for next year will relate to the “strategic marketing
process to solve marketing problems.” Assessment of the “situation facing the decision
maker (culminating in a SWOT chart)” will be conducted in MKTG 429.
Assessment Activities for MBA Program
The HCOB has three goals for its MBA program.
(1) MBA graduates will be professional communicators.
(2) MBA graduates will be effective decision makers.
(3) MBA graduates will be leadership oriented.
Learning objectives were written for each goal, and a course matrix was prepared. An
assessment timeline still needs to be prepared. The goals and objectives were approved
by the HCOB Graduate Committee and Executive Committee.
Professional Communicators
Assessment of writing in the MBA program began Spring Semester 2008. This is an
ongoing assessment across multiple courses. Results will be summarized and analyzed
next year.
Effective Decision Makers
One of the learning objectives under this goal is that students will “apply global
perspectives to business situations.” This objective was assessed in MBA 675. Most
students received acceptable ratings. It was noted that students received this rating
(rather than exemplary) due to lack of depth of analysis or lack of elaboration, not
because of inaccuracies in the analysis. The management faculty found that the rubric
used to evaluate global perspectives was fairly simple to use. Relatively good inter-rater
reliability was observed. They did note, however, that instructors who taught the course
and applied the rubric to their own students were looking for more specific answers than
instructors who applied the rubric to someone else's students. Suggestions were made for
improving the rubric for future use.
Conclusion
Assessment in the HCOB received a major overhaul in 2007-2008. The HCOB Task
Force met weekly and had many lengthy discussions on how to accomplish the
assessment activities needed to measure our student learning goals. But much remains to
be done. Our biggest concern is how to keep the process moving forward. The foundational
goals and objectives have been agreed upon, rubrics have been created or adapted,
timelines have been prepared, and assessment activities have been conducted. But
assessment is still not an ingrained habit for many HCOB faculty members, and gentle
reminders and nudges will still be needed, at least in the near future.
Appendix B. Summary Information on Academic Programs to Accompany
Appendix A
College of Education by Elaine Ackerman
College of Fine Arts and Humanities by Wendy Bjorklund
College of Science and Engineering by Maria Womack
College of Social Sciences by Joseph Melcher
Herberger College of Business by Carol Gaumnitz
College of Education
Department
Center for Information
Media (CIM)
Assessment
Discussions
This Year
Changes
Proposed
This
Year
Based on
Data
Changes
Implemented
This Year
X
X
X
X
X
X
X
X
X
X
assessed as
part of MS
Early
Childhood
ED
X
X
X
X
MS - Early Childhood
Education
X
X
X
X
X
X
MS - Early Childhood
Special Education
X
X
X
X
MS - Family Studies
X
MS - Early Education
X
X
X
X
X
X
X
X
X
X
X
Program
BS - Information Media
MS - Information Media
Child and Family
Studies (CFS)
BS Early Childhood
Education
GC Child and Family
Studies
Counselor Education
and Educational
Psychology (CEEP)
Report
Submitted
SLOs
Assessed
Come from
Online
Assessment
Plan
Findings
Reported
on SLO
Measures
X
X
X
Data Collected
on
Implementation
of Changes
X
MS - School Counseling
X
MS - College Counseling
and Student Development
X
X
Educational
Leadership and
Community
Psychology (ELCP)
MS - Counseling
Psychology: Rehabilitation
X
MS - Higher Education
Administration
X
Ed D - Higher Education
Administration
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
BES- Community
Psychology
BS - Chemical
Dependency
BS - Community
Psychology
Certificate Chemical
Dependency
assessed as
part of BS
Chem. Dep.
GC Marriage and Family
Therapy
assessed as
part of MS
Marr & Fam
X
X
X
X
X
X
X
MS Community Education
X
X
X
X
MS - Counseling
Psychology: Community
Counseling
MS - Marriage and Family
Therapy
X
X
X
X
X
X
X
X
X
X
MS - Applied Behavior
Analysis
X
X
X
X
MS - Educational
Administration and
Leadership
X
X
X
X
X
X
X
X
X
Spec. - Educational
Administration and
Leadership
Health, Physical
Education, Recreation
and Sport Science
X
X
X
X
X
X
BES- Physical Education
(non teaching)
X
X
X
BS - Athletic Training
X
BS - Community Health
X
BS - Physical Education
X
BES- Health Education
X
MS - Exercise Science
X
MS- Physical Education
X
MS - Sports Management
X
X
X
X
X
X
BS - Recreation and Sports
Management
Human Relations and
Multicultural
Education (HURL)
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
Minor- Human Relations
(17 credits)
Minor- Human Relations
(24 credits)
MS - Social Responsibility
Special Education
(SPED)
BS - Special Education
GC- Developmental
Disabilities
X
X
X
X
assessed as
part of MS
SPED
Teacher Development
(TDEV)
GC- Emotional Behavioral
Disorders
assessed as
part of MS
SPED
GC- Learning Disabilities
assessed as
part of MS
SPED
GC- Physical/Health
Disabilities
assessed as
part of MS
SPED
GC- Autism
assessed as
part of MS
SPED
MS - Special Education
X
X
X
X
BS - Elementary/ K-8
Education
X
X
X
X
GC- Reading Teacher K-12
X
X
X
X
GC- Teacher Leader
X
X
X
MS - Curriculum and
Instruction
X
X
X
X
X
X
X
Upper division writing assessment reports received from these programs in the College of Education:
Athletic Training
College of Fine Arts and Humanities
Department
Report
Submitted
X
SLOs
Assessed
Come from
Online
Assessment
Plan
X
Findings
Reported
on SLO
Measures
X
Assessment
Discussions
This Year
X
BA Art History
X
X
X
X
BFA Studio Art
X
X
X
X
X
BFA Studio Art Graphic
Design
X
X
X
X
X
BS Art Education
X
X
X
X
X
BS
X
X
X
X
X
BES
X
X
X
X
X
MS
BA Communication
Studies
BA Supplemental
X
X
X
X
Program
Art
Communication
Sciences and
Disorders
Communication
Studies
BA Art
BS Interdepartmental
BS Comm. Arts & Lit.
BES
Changes
Proposed
This
Year
Based on
Data
X
Changes
Implemented
This Year
Data Collected
on
Implementation
of Changes
X
X
X
X
see English
X
Intercultural
Communication Minor
English
BA English
X
X
BA Creative Writing
X
X
BS Comm. Arts & Lit.
X
X
X
X
X
X
X
X
X
BES
TESL Minor
MS TESL
X
X
X
MA English
Foreign Language
Spanish BA/BS
X
X
X
French BA/BS
X
X
X
German BA/ BS
X
X
X
X
X
X
X
X
X
Spanish BES
French BES
German BES
Foreign Languages Minor
Russian Minor
Humanities BA
Mass Communication
Music
BS
X
X
X
MS
X
X
X
MM Music
X
X
BA Music
BES
B Mus
BS Music Education
Philosophy
BA
X
X
X
X
X
BES
X
X
X
X
X
Interdisciplinary Minor
Philosophy for
Mathematics Minor
Religious Studies Minor
Theatre/Film
Studies/Dance
Theatre BA
X
X
X
X
X
Film Studies BA
X
X
X
X
X
X
Dance Minor
X
X
X
X
X
X
Upper division writing assessment reports received from these programs in the College of Fine Arts and Humanities:
Art
Communication Sciences and Disorders
Communication Studies
Music
Theatre
General Education assessment reports received from these programs in the College of Fine Arts and Humanities:
Art
Communication Studies
Foreign Language
Music
Philosophy
College of Science and Engineering
Report
Submitted
SLOs
Assessed
Come from
Online
Assessment
Plan
Findings
Reported
on SLO
Measures
Assessment
Discussions
This Year
Changes
Proposed
This
Year
Based on
Data
Department
Aviation
Program
Aviation BES Minor
X
X
X
X
X
X
X
X
Biological Sciences
Aviation Major BA
Aviation Maintenance
BAS
Aquatic Biology BS
Biology Teaching BS
X
Biomedical BS
X
X
X
Biotechnology BS
X
X
X
Cell Biology BS
Ecology and Field Biology
BS
General Biology BS
X
General Biology BES
X
Changes
Implemented
This Year
Data Collected
on
Implementation
of Changes
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
Medical Technology BES
Medical Technology BS
Biological Sciences MA
X
Biological Sciences:
CMOB MS
X
Chemistry
Biological Sciences: ENR
MS
X
Biology Teaching MS
Chemistry- Liberal Arts
BA
Chemistry BES
X
X
X
X
X
X
Biochemistry BS
X
X
X
Professional Chemistry
ACS Approved
X
X
X
Chemistry Teaching BS
X
X
X
Computer Science CSAB
Accredited BS
X
X
X
X
X
Applied Computer Science
BS
X
X
X
X
X
Earth Science BA
X
X
X
X
X
X
X
Geology BS
X
X
X
X
X
X
X
Hydrology BS
X
X
X
X
X
X
X
X
X
X
X
X
X
Forensic Science Minor
Computer Science
Computer Algorithmics
Minor
Computer Organization
and Programming Minor
Computer Science MS
Earth and Atmospheric
Sciences
Meteorology BS
Earth Science Teaching BS
Electrical and
Computer Engineering
Computer Engineering BS
Electrical Engineering BS
X
X
X
X
X
X
X
X
X
X
X
Mathematics BA
X
X
X
X
X
X
X
Mathematics Teaching BS
Elementary Education
Minor
Mathematics Teaching
MS
X
X
X
X
X
X
X
Mechanical Engineering
BS
X
X
X
X
X
Manufacturing Engineering
BS
X
X
X
X
X
X
X
X
X
X
Electrical Engineering MS
Environmental and
Technological Studies
Environmental Science BS
Environmental Studies BS
Technology Education BS
Technology Management
BS
Environmental Science
BES
Technology Studies BES
Environmental and
Technology Studies MS
Mathematics
Mechanical and
Manufacturing
Engineering
Nursing Sciences
Master of Engineering
Management MEM
Mechanical Engineering
MS
Nursing BS
X
X
Nursing MS (inactive)
Physics, Astronomy
and Engineering
Science
Physics BES
X
Physics BS
X
X
X
X
Physics Teaching BS
X
X
X
X
Nuclear Medicine
Technology BS
X
X
X
X
X
X
Radiologic Technology BS
X
X
X
X
X
X
Optics Minor
Regulatory Affairs and
Services
Regulatory Affairs and
Services MS (inactive)
Statistics/Computer
Networking and
Applications
Network Modeling and
Simulation BS
X
X
Network Information
Security Systems BS
X
X
Statistics BS
X
X
X
X
Computer Networking and
Applications-Language
Packages and Operating
Systems Minor
Data Communications
Minor
Computer Networking and
Applications-Language
Packages and
Communications Minor
C.N.A. MS
Upper division writing assessment reports received from these programs in the College of Science and Engineering:
Computer Engineering
Earth and Atmospheric Sciences
Electrical Engineering
Mathematics
Network Modeling and Simulation
Nursing
Nuclear Medicine Technology/Radiologic Technology
Physics
Statistics
General Education assessment reports received from these programs in the College of Science and Engineering:
Mathematics
Physics
Statistics
College of Social Sciences
Department
Community Studies
Report
Submitted
SLOs
Assessed
Come from
Online
Assessment
Plan
Assessment
Discussions
This Year
Changes
Proposed
This
Year
Based on
Data
Findings
Reported
on SLO
Measures
Changes
Implemented
This Year
X
X
X
X
X
X
X
X
X
X
X
X
Gerontology MS
X
X
X
X
X
X
Gerontology GC
X
X
X
X
X
CJS BA
X
X
X
X
Business Economics BA
X
X
X
X
Economics BA
Mathematical Economics
BS
Applied Economics MS
X
X
X
X
X
X
Program
Community Development
BA
Data Collected
on
Implementation
of Changes
Community Development
BES
Gerontology (minor)
Heritage Preservation
(minor- inactive)
Criminal Justice
Studies
X
CJS MA
Public Safety Executive
Leadership (certificate)
Economics
Public and Nonprofit
Institutions MS
X
Ethnic Studies
(exclusively minor
programs)
Geography
African American Studies
X
X
American Indian Studies
X
X
Asian Pacific American
Studies
X
X
Chicano/a Studies
X
X
Ethnic Studies
X
X
BA
BES
Travel and Tourism BA
Land Surveying and
Mapping Science BES
Land Surveying and
Mapping Science BS
Land Surveying and
Mapping Science
(certificate)
MS
Geographic Information
Science MS
Geographic Information
Science GC
Tourism Planning and
Development MS
Global Studies Center
Global Studies BA
History
African Studies Minor
East Asian Studies Minor
BA
Latin American Studies
BA
MA
X
X
X
X
X
History Teacher Education
MS
Political Science
BA
X
X
Public Administration BA
X
X
Secondary Education BS
X
X
International Relations and
Affairs BA
Psychology
New
Program
BA
X
X
X
X
Industrial-Organizational
Psychology MS
X
X
X
X
Social Science
Social Science BA
Social Work
Social Science BES
Social Science Education
BS
Social Science Education
MS
BS
X
X
X
X
MS
X
X
X
X
Sociology BA
X
Sociology and
Anthropology
X
X
X
Applied Sociology BA
Sociology
Interdepartmental BA
Anthropology BA
Sociology BES
Anthropology BES
Women’s Studies
Cultural Resource
Management MS
X
X
X
X
BA
X
X
X
X
X
BES
Upper division writing assessment reports received from these programs in the College of Social Sciences:
Community Development
Psychology
Social Work
Sociology
General Education assessment reports received from these programs in the College of Social Sciences:
Psychology
Herberger College of Business
Report
Submitted
SLOs
Assessed
Come from
Online
Assessment
Plan
Findings
Reported
on SLO
Measures
Assessment
Discussions
This Year
Changes
Proposed
This
Year
Based on
Data
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
X
Department
HCOB
Program
BS Degree
Accounting
BCIS
MBA Degree
Accounting Major and
Minor
BCIS Major and Minor
FIRE
Finance Major and Minor
X
X
Insurance Major and Minor
Real Estate Major and
Minor
Management Major and
Minor
X
X
X
X
X
Management
Marketing and
Business Law
Data Collected
on
Implementation
of Changes
X
X
X
X
X
X
X
X
X
X
X
X
Entrepreneurship Major
and Minor
X
X
International Business
Major and Minor
X
X
Marketing Major and
Minor
Changes
Implemented
This Year
General Business Minor
Non Departmental
Upper division writing assessment:
Assessment of upper division writing was a major focus across the Herberger College of Business this year.
Appendix C. Summary of Activities and Likely Impacts of
2007-08 Assessment Grants
Summary of Activities and Likely Impacts of 2007-08 Assessment Grants
Prepared by James Sherohman, University Assessment Director
Submitted June 25, 2008
College: COSE; Programs: CNMS and CIS
1. Operationally defined competencies and student learning outcomes for select CNMS and CIS
courses
2. Mapped detailed topics for select CNMS and CIS courses to student learning outcomes
3. Developed a databank for systematically aggregating the overall learning outcomes in CNMS
and CIS
4. Developed and administered a survey of graduates to assess the learning impacts of the CNMS
and CIS programs and the congruence between CNMS and CIS learning outcomes and industry
requirements. (The original task was to develop and administer instruments for assessing the
learning impacts on both current students and graduates.)
Quote: “The C-SLOAP-C project offers new opportunities for faculty members to systematically
connect tests, quizzes, assignments and projects in CNMS and CIS courses to program and course
learning outcomes. The C-SLOAP-C project offers the department a unique opportunity to begin to
use a database to target specific skill areas of CNMS and CIS where students are deficient in
learning.”
College: COSE; Program: Mathematics
1. Developed common final exam questions for MATH 221 (Calculus and Analytic Geometry 1) and
MATH 222 (Calculus and Analytic Geometry 2) to assess program student learning outcomes.
2. Developed scoring rubrics and detailed descriptions of the process used to develop the
measures, so that they can be implemented by multiple instructors.
3. Revised student learning outcomes for MATH 221 and 222.
4. This grant had many participants, so there was a lot of faculty participation and buy-in built into
the project. The documents have been presented to the department and referred to the
Calculus Committee.
College: COSE; Program: Nursing
1. An assessment model was developed for the nursing program that threaded dosage calculation
content from simple to complex through each level of study.
2. Predicted outcomes included a greater number of students passing the dosage calculation
competency on the first attempt. Faculty believe that restructuring content will lead to
student success. Independent dosage calculation learning modules were created using Gray
Morris's Calculate with Confidence (Elsevier, 2006) and electronic resources from clinical
and baccalaureate listservs. Based on a review of listserv resources, the minimum passing
score was increased from 88% to 90%.
3. This spring's cohort was the first to use this assessment model. Assessment will continue
as these and subsequent students progress through the program.
4. 100% of Level One students participated in the Dosage & Solutions Independent learning
module in Spring 2007 and Spring 2008. 68% of Level One students passed the competency
evaluations at first attempt Spring 2007. 90% of Level One students passed at first attempt
Spring 2008 in the new model.
5. The assessment model was reviewed with Level coordinators for their input on the
appropriateness of the content at each level. The independent learning modules and
assessment strategies will be incorporated into each level as the students progress.
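One way to judge whether the rise in first-attempt pass rates reported above (68% in Spring 2007 to 90% in Spring 2008) exceeds chance variation is a two-proportion z-test, sketched below in Python. The cohort sizes are assumptions, since the report gives only percentages, and the report itself makes no statistical claim.

    # Rough two-proportion z-test for the change in first-attempt pass rates.
    # Cohort sizes (n1, n2) are assumed; only the percentages are reported.
    from math import sqrt, erf

    n1, p1 = 50, 0.68   # Spring 2007 (cohort size assumed)
    n2, p2 = 50, 0.90   # Spring 2008 (cohort size assumed)

    pooled = (n1 * p1 + n2 * p2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    print(f"z = {z:.2f}, two-sided p = {p_value:.4f}")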
College: COSE; Program: Aviation
1. The original intent was to focus primarily on revising and improving assessment tools and
methods, but a change in the standards of the department’s accrediting body made it necessary
to revise the assessment plans for all programs prior to moving to that step.
2. Revision of the assessment plans included mission, vision, and values, student learning
outcomes, program matrices, and timelines. Assessment tools also are identified.
3. The grant has helped to put the department in position to conduct effective assessment in the
future. The matrices and timeline indicate what needs to be assessed and where; they have
identified multiple data sources in addition to courses; they have at least a tentative plan for
regular assessment discussions; and they seem to have the annual reporting process well
connected to the requirements of their accrediting body.
College: COSE; Programs: Mechanical Engineering and Manufacturing Engineering
1. This grant focused primarily on developing measures and rubrics to assess the twelve learning
outcomes for MME programs.
2. Data sheets were created for each of the twelve learning outcomes. Each data sheet breaks the
outcome into one or more performance criteria, with a 5-level rubric for each criterion. This is
followed by a list of data sources used to assess each rubric, including collection frequency. Also
included are course maps for each program that show all courses that contribute to that
outcome and which ones are used for data collection.
3. New data collection methods will be implemented starting in the fall of 2008.
Quote: “There is now an organized and documented method of assembling data from all sources to
provide a better picture of where we stand. The methods also distribute data collection and
outcome assessment so part of the process is done every semester. This makes it more a matter of
habit and does not force the every six year workload of assembling all the data for the next
accreditation (ABET) review. Each outcome is assessed every three years, or two done per semester.
That provides two complete assessment, evaluation, and improvement cycles within our six year
accreditation cycle.”
College: CoFAH; Programs: Communication Studies B.A., interdisciplinary (non-teaching) B.S., and
supplemental major and minor
1. Student learning outcomes were developed for these four revised programs, and relevant
traits/components were identified for each student learning outcome.
2. An assessment matrix was developed that identifies where, in each program, these student
learning outcomes could be measured.
3. A timeline was developed for assessing the student learning outcomes for each program.
4. The assessment plan for the 40-credit B.A. was approved at the final department meeting of the
year. The assessment plans for the other two majors and the minor are based on that plan, but
the department has not yet approved them.
5. In the process of developing the assessment plan for the interdisciplinary B.S. major, a gap was
discovered between the coursework and learning outcomes for that major that requires a
curriculum revision.
College: CoFAH; Program: Russian
1. Instruments (mostly tests) used to assess Russian at other institutions, both in the US and
abroad, were researched, and a bank of such tests that could bring coherence into assessment
of student learning outcomes at SCSU was identified. At the end of each year (first, second, and
third) students would take a test to evaluate their writing, reading, and speaking knowledge of
the language. Their scores would be compared to requirements for the appropriate level of
studies (novice, intermediate, and advanced), based on the ACTFL guidelines, now set as the
guidelines for student learning outcomes in Russian.
2. The Russian program currently does not have an assessment plan. Although the department
uses guidelines developed by disciplinary and accrediting bodies, they haven’t formally adopted
them as student learning outcomes, and they are general in nature rather than pertaining just to
Russian programs. The information obtained through this assessment grant will be used this fall
to develop an assessment plan for the Russian program.
College: COSS; Programs: Gerontology minor, M.S., and graduate certificate
1. This project focused on revision of the assessment plans for the Gerontology minor, M.S., and
graduate certificate. Student learning outcomes were revised and a program matrix and timeline
were created.
2. The resulting assessment plan goes beyond the basic components recommended by the
Assessment Steering Committee. It identifies assessment methods to be used in specific
courses, and it specifies procedures through which faculty efforts will be coordinated across
courses.
3. The assessment plan was implemented for 2008 and the annual assessment report (due
September 15) already has been submitted for each of the three programs.
College: COE; Program: College-wide
1. This project assessed intercultural competencies of teacher candidates who are exiting from the
College of Education. Data were collected. Findings are broken down by department
(elementary, secondary, special education, CFS).
2. The data will become part of a larger study with comparative data from South Africa, Chile, and
China. The results of this larger study will be used to assist in program development and
institutional partnership opportunities for promoting intercultural competence.
3. A report and recommendations will be compiled and provided to departments. Areas will be
identified for improvement.
College: COE; Program: Doctor of Education in Higher Education Administration
1. Five assessment rubrics were created for use in evaluating admissions qualifications and
progress through the coursework for the Doctor of Education program in Higher Education
Administration.
2. The rubrics correspond to the five key transition points in the program. The rubrics clarify what
is required for admission and for satisfactory progress through the program. In the future, the
rubrics may be used to evaluate the effectiveness of course sequencing.
3. Two of the five rubrics have been approved and implemented by program faculty. The other
three have received preliminary approval and are scheduled for implementation in fall 2008
through August 2010.
4. Based upon this work, a conference proposal on rubric design and outcomes has been
submitted.
College: CoFAH; Programs: Linguistics emphasis in English BA, TESL BS Minor, TESL MA, K-12 ESL
licensure
1. The original plan was to use data on student grades from ISRS, but technical complications
prevented that. DARS data were used instead, which limited the study to TESL MA program
students and led to the use of Pearson's correlation coefficient instead of regression
analysis. The analysis correlated course grades with scores on the Praxis II TESOL exam.
2. Baseline correlations were established for the master’s program. Grades for most courses were
significantly correlated with Praxis exam scores.
3. Grades for several courses had low correlations with the Praxis score. The report concluded that
qualitative measures of student performance should be developed for these courses.
4. The overall score on the TESOL Praxis II exam was used for this analysis because that is what is
entered in the ISRS and DARS data systems. The exam does have sub-scores that correspond
much more closely to program student learning outcomes. These sub-scores would be quite
useful for assessment purposes, and a way should be found to make them available.
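For reference, Pearson's correlation coefficient as used in this analysis can be computed directly from paired values. The sketch below (Python) uses hypothetical grade/score pairs, since the DARS data are not reproduced in this report.

    # Pearson's r between course grades and exam scores, computed from
    # first principles. The paired values are hypothetical placeholders.
    from math import sqrt

    grades = [3.7, 3.3, 4.0, 2.7, 3.0, 3.7]   # hypothetical course GPAs
    praxis = [178, 165, 182, 150, 158, 171]   # hypothetical Praxis II scores

    n = len(grades)
    mg, mp = sum(grades) / n, sum(praxis) / n
    cov = sum((g - mg) * (p - mp) for g, p in zip(grades, praxis))
    sg = sqrt(sum((g - mg) ** 2 for g in grades))
    sp = sqrt(sum((p - mp) ** 2 for p in praxis))
    print(f"Pearson's r = {cov / (sg * sp):.2f}")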
Division: Student Life and Development; Department: division as a whole
1. This grant purchased copies of The Assessment Practice in Student Affairs: An Application
Manual for each department and center in Student Life and Development. The books are being
used as a reference by the departments to develop student learning outcomes, which will align
with the student learning outcomes for the division as a whole.
2. Student Life and Development is conducting an online survey to measure assessment skills
and knowledge within the SLD division. Results from that survey will be used to determine
training needs for the division. The book also will be used for the training.
3. The grant also purchased 500 magnets with the SLD learning outcomes stated on them. These
magnets will be passed out during freshman orientation in the fall of 2008 to familiarize
students with the learning outcomes.
4. This project fits into overall plans for SLD assessment. The impact of the grant will not be known
until these plans are more fully implemented.
College: CoFAH; Program: Communication Sciences and Disorders Graduate Program
1. This was the only grant that was funded in the category of “integrating assessment of online and
on-campus program components.” Funding was provided by the Dean of Continuing Studies.
2. Assessment tools were developed to monitor student performance, evaluate faculty
participation, obtain information from internship supervisors about ethical issues, and evaluate
the effectiveness of instructional techniques.
3. The project focuses primarily on the course level (CSD 677: Ethical and Professional Issues in
Speech-Language Pathology). Although the connection with program student learning outcomes
is not clearly identified, it probably exists. CSD 677 appears to be a required course in the
program, and the results of the grant will be disseminated at a department meeting in the fall.
Unit: LR&TS; Program: Master’s in Educational Media
1. Dispositions that describe successful degree candidates were identified and connected with
program goals and courses. Instruments for assessing these dispositions were created, and
procedures for systematically gathering data on these dispositions were developed.
2. Assessment of dispositions is now an integral part of the assessment of library media specialist
candidates for licensure. Dispositions will be included on course syllabi and will be emphasized
throughout the program. All faculty teaching graduate courses will assess dispositions.
3. Two needed curriculum changes were identified: addition of a prerequisite for one course and
addition of a one-credit independent study for non-SCSU students to prepare materials for
licensure application.
Appendix D. Progress Toward Goals of Student
Learning Projects for the HLC Assessment Academy
UNIVERSITY ASSESSMENT SYSTEM PROJECT

1) Refine institutional assessment policies
   a) Define assessment plan. Timeline: Spring 2007. Monitoring: committee approval process. Indicator of success: approval by Assessment Steering Committee. Status as of 11/07: use recommended by Assessment Steering Committee. Status as of 01/09: use recommended by Assessment Steering Committee.
   b) Create template for annual reports. Timeline: Spring 2007. Monitoring: committee approval process. Indicator of success: approval by Assessment Steering Committee. Status as of 11/07: use recommended by Assessment Steering Committee. Status as of 01/09: use recommended by Assessment Steering Committee.

2) Communicate institutional assessment policies
   a) Include all policies and forms on website. Timeline: Spring 2007. Monitoring: policy documents and forms posted on website. Indicator of success: templates and guidelines for reports and assessment plans posted on Assessment website. Status as of 11/07: Completed. Status as of 01/09: Completed.
   b) Disseminate assessment resources on website. Timeline: Spring 2008. Monitoring: resource content posted on Assessment website. Indicator of success: all parts of Resources page updated and reorganized. Status as of 11/07: In process. Status as of 01/09: Completed.

3) Build institutional assessment capacity
   a) Expand professional development opportunities for assessment directors and consultants. Timeline: Fall 2007. Monitoring: number attending assessment conferences off campus. Indicator of success: at least 5 assessment directors or consultants per year attend assessment conferences off campus. Status as of 11/07: $200 subsidy will be provided to peer consultant trainees. Status as of 01/09: 3 college ADs at HLC in 2008, 3 in 2008; several PCs attend '08 Assessment Institute.
   b) Fund assessment grants. Timeline: Fall 2007. Monitoring: amount of funds available. Indicator of success: maintain or increase funding from previous year. Status as of 11/07: $30,000 available for 2007-08; $25,712 awarded. Status as of 01/09: $20,000 available for 2008-09.
   c) Train assessment consultants. Timeline: Fall 2007. Monitoring: number of consultants trained. Indicator of success: train 10 assessment peer consultants per year. Status as of 11/07: none yet; first training in January 2008. Status as of 01/09: 21 trained in January 2008, 16 in October 2008.
   d) Assist programs through peer consulting program. Timeline: Fall 2008. Monitoring: number of programs assisted. Indicator of success: assist 10 programs per year. Status as of 11/07: none yet. Status as of 01/09: 6 assisted since January 2008.
   e) Encourage faculty participation in sessions offered by visiting assessment expert at two-day all-University event. Timeline: Spring 2008. Monitoring: number of faculty participants. Indicator of success: participation of 200 faculty members in at least one session. Status as of 11/07: N.A. (event is in January). Status as of 01/09: attendance at workshops: 100 assessment, 75 grading, 35 general education.
   f) Disseminate assessment resources. Timeline: Spring 2008. Monitoring: resource content posted on Assessment website. Indicator of success: all parts of Resources page updated and reorganized. Status as of 11/07: In process. Status as of 01/09: Completed.
   g) Implement data system. Timeline: Spring 2008. Monitoring: number of programs using system. Indicator of success: 25 programs using data system for assessment. Status as of 11/07: proof of concept underway. Status as of 01/09: still in pilot phase.
4) Integrate assessment into work life of institution
   a) Offer assessment workshops on campus faculty development days. Timeline: Spring 2007. Monitoring: number of workshops offered. Indicator of success: offer at least 4 workshops per year. Status as of 11/07: 2 in January 2007, 3 in April 2007, 3 in August 2007. Status as of 01/09: 2 in April 2008, 2 in August 2008, 3 in January 2009.
   b) Write a position paper on assessment, faculty workload, and the reward system. Timeline: Spring 2009. Monitoring: committee approval process. Indicator of success: approval by Assessment Steering Committee. Status as of 11/07: In process. Status as of 01/09: In process.

5) Implement assessment of assessment
   a) Develop a plan for assessing assessment at an institutional level. Timeline: Fall 2008. Monitoring: committee approval process. Indicator of success: approval by Assessment Steering Committee. Status as of 11/07: not implemented. Status as of 01/09: College of Science and Engineering pilot project being implemented.
   b) Implement rubric to assess program assessment plans. Timeline: Spring 2011. Monitoring: implementation at level of academic college. Indicator of success: implemented in all 5 colleges. Status as of 11/07: not implemented. Status as of 01/09: College of Science and Engineering pilot project implemented.
   c) Implement rubric to assess program annual reports. Timeline: Spring 2011. Monitoring: implementation at level of academic college. Indicator of success: implemented in all 5 colleges. Status as of 11/07: not implemented. Status as of 01/09: College of Science and Engineering pilot project implemented.
   d) Review assessment plans and provide feedback to programs/departments. Timeline: Spring 2011. Monitoring: % of programs provided feedback. Indicator of success: all departments in the college receive feedback. Status as of 11/07: not implemented. Status as of 01/09: College of Science and Engineering pilot project being implemented.
   e) Review annual reports and provide feedback to programs/departments. Timeline: Spring 2011. Monitoring: % of programs provided feedback. Indicator of success: 100% of programs provided feedback. Status as of 11/07: not implemented. Status as of 01/09: College of Science and Engineering pilot project being implemented.
   f) Improve institutional assessment policies and procedures based upon assessment of assessment. Timeline: Fall 2010. Monitoring: number of improvements implemented. Indicator of success: at least one improvement implemented. Status as of 11/07: not implemented. Status as of 01/09: not implemented.
ASSESSMENT OF STUDENT LEARNING IN PROGRAMS PROJECT
(PRIMARY ASSESSMENT ACADEMY PROJECT)

1) Implement institutional assessment plan at level of major programs
   a. Submit mission statement for posting on website. Timeline: Spring 2007. Monitoring: % of major programs meeting. Indicator of success: 95% meet. Status as of 11/07: University 85% (115/136); COE 86%; COFAH 94%; COSE 100%; COSS 92%; HCOB 10%. Status as of 01/09: University 92% (127/138); COE 86%; COFAH 86%; COSE 100%; COSS 89%; HCOB 100%.
   b. Submit student learning outcomes (SLOs) for posting on website. Timeline: Spring 2007. Monitoring: % of major programs meeting. Indicator of success: 95% meet. Status as of 11/07: University 50% (68/136); COE 7%; COFAH 18%; COSE 94%; COSS 65%; HCOB 0%. Status as of 01/09: University 73% (101/138); COE 7%; COFAH 62%; COSE 100%; COSS 71%; HCOB 100%.
   c. Submit program matrix (curriculum map) for posting on website. Timeline: Spring 2008. Monitoring: % of major programs meeting. Indicator of success: 95% meet. Status as of 11/07: University 12% (16/136); COE 0%; COFAH 21%; COSE 0%; COSS 35%; HCOB 0%. Status as of 01/09: University 48% (66/138); COE 0%; COFAH 45%; COSE 62%; COSS 68%; HCOB 0%.
   d. Submit timeline for posting on website. Timeline: Spring 2008. Monitoring: % of major programs meeting. Indicator of success: 95% meet. Status as of 11/07: University 7% (9/136); COE 0%; COFAH 9%; COSE 0%; COSS 17%; HCOB 0%. Status as of 01/09: University 18% (26/138); COE 0%; COFAH 7%; COSE 20%; COSS 50%; HCOB 0%.
2) Implement institutional assessment reporting system at program level
   a. Submit annual assessment report. Timeline: Spring 2007. Monitoring: % of major programs submitting report each year. Indicator of success: 95% meet annually. Status as of 11/07: University 70% (95/136); COE 40%; COFAH 52%; COSE 74%; COSS 90%; HCOB 90%. Status as of 01/09: University 68% (86/126); COE 75%; COFAH 71%; COSE 78%; COSS 38%; HCOB 100%.
   b. Collect data. Timeline: Spring 2008. Monitoring: % of programs collecting each year. Indicator of success: 85% meet. Status as of 11/07: University 49% (67/136); COE 40%; COFAH 33%; COSE 62%; COSS 48%; HCOB 60%. Status as of 01/09: University 58% (73/126); COE 88%; COFAH 71%; COSE 64%; COSS 21%; HCOB 70%.
   c. Use direct measures of student learning. Timeline: Spring 2008. Monitoring: % using at least one direct measure each year. Indicator of success: 70% meet. Status as of 11/07: University 53% (72/136); COE 40%; COFAH 52%; COSE 62%; COSS 39%; HCOB 80%. Status as of 01/09: University 54% (68/126); COE 88%; COFAH 68%; COSE 58%; COSS 17%; HCOB 70%.
   (Task "d" deleted in 2008 revision.)
   e. Describe findings in annual report. Timeline: Spring 2008. Monitoring: % including summary of findings for at least one student learning outcome in report each year. Indicator of success: 85% meet. Status as of 01/09: University 47% (59/126); COE 63%; COFAH 71%; COSE 40%; COSS 17%; HCOB 70%.
2) Implement institutional assessment reporting system at program level (continued)
   f. Meet annually to discuss assessment results or practices. Timeline: Spring 2008. Monitoring: % of programs meeting at least once per year to discuss results or assessment practices. Indicator of success: 95% meet. Status as of 11/07: University 50% (68/136); COE 40%; COFAH 48%; COSE 60%; COSS 35%; HCOB 70%. Status as of 01/09: University 67% (85/126); COE 88%; COFAH 68%; COSE 51%; COSS 86%; HCOB 70%.
   g. Use data to improve student learning. Timeline: Spring 2009. Monitoring: % citing at least one use for program improvement in past three years. Indicator of success: 65% meet. Status as of 01/09: University 37% (46/126); COE 13%; COFAH 38%; COSE 51%; COSS 14%; HCOB 50%.
   h. Use data to improve assessment. Timeline: Spring 2009. Monitoring: % citing at least one use for program improvement in past three years. Indicator of success: 65% meet. Status as of 01/09: University 22% (28/126); COE 38%; COFAH 18%; COSE 31%; COSS 10%; HCOB 20%.
   i. Complete data collection cycle. Timeline: Spring 2013. Monitoring: % that have assessed all outcomes since 2007. Indicator of success: 65% meet.
   j. Obtain direct measures of all student learning outcomes. Timeline: Spring 2013. Monitoring: % using at least one direct measure for each SLO since 2007. Indicator of success: 65% meet.
GENERAL EDUCATION ASSESSMENT PROJECT

1) Approve the program
   a. Approve new General Education assessment plan. Timeline: Fall 2006. Monitoring: committee approval process. Indicator of success: approval by Faculty Senate and agreed to at Meet & Confer. Status: Completed.
   b. Approve new General Education program. Timeline: Spring 2008. Monitoring: committee approval process. Indicator of success: approval by Faculty Senate and agreed to at Meet & Confer.

2) Implement program structure
   a. Hire General Education Assessment Director (GEAD). Timeline: Spring 2007. Monitoring: Director is hired. Indicator of success: Director is hired. Status: Completed.
   b. Select cross-disciplinary General Education Teams for Goal Oversight (GETGO). Timeline: Spring 2008. Monitoring: % of teams with 3 or more members. Indicator of success: 100% have 3 or more members.
   c. Develop an assessment plan for each goal area. Timeline: Fall 2008. Monitoring: % of areas with assessment plans. Indicator of success: 100% have assessment plans.
   d. Publish guidelines for assessment by GETGOs. Timeline: Fall 2008. Monitoring: committee approval process. Indicator of success: approval by Faculty Senate and agreed to at Meet & Confer.
   e. Develop rubrics for SLOs in each goal area. Timeline: Fall 2008. Monitoring: committee approval process. Indicator of success: approval by Faculty Senate and agreed to at Meet & Confer.
   f. Revalidate all General Education courses. Timeline: Spring 2009. Monitoring: committee approval process. Indicator of success: approval by Faculty Senate and agreed to at Meet & Confer.
3) Collect and analyze data
 a. Use data from the NSSE and CAAP to inform General Education assessment. Timeline: Spring 2008. Method: General Education Assessment Committee identifies goal areas for which these instruments are useful. Indicator of success: GETGOs for which these instruments are useful include pertinent results in assessment report.
 b. Collect baseline data from sample of courses in each goal area. Timeline: Spring 2009. Method: % of GETGOs that have collected baseline data. Indicator of success: 100% meet.
 c. Complete annual assessment report in each goal area. Timeline: Spring 2009. Method: % of GETGOs that have completed annual report. Indicator of success: 100% meet.
 d. Use direct measures of student learning. Timeline: Spring 2009. Method: % of GETGOs using at least one direct measure each year. Indicator of success: 100% meet.
 e. Include assessment tools with annual report. Timeline: Spring 2009. Method: % of GETGOs attaching at least one assessment tool to the report each year. Indicator of success: 100% meet.
 f. Present key findings in annual report. Timeline: Spring 2009. Method: % of GETGOs including summary of key findings in report each year. Indicator of success: 100% meet.
 g. Complete the data collection cycle. Timeline: Spring 2012. Method: % of GETGOs that have assessed all outcomes since 2007. Indicator of success: 100% meet.
 h. Obtain direct measures of all student learning outcomes. Timeline: Spring 2012. Method: % of GETGOs using at least one direct measure for each SLO since 2007. Indicator of success: 100% meet.
4) Use data for improvement
 a. Hold initial annual meeting to discuss assessment results. Timeline: Spring 2009. Method: % of GETGOs meeting at least once per year to discuss results. Indicator of success: 100% meet.
 b. Use data to improve the General Education Program. Timeline: Spring 2010. Method: % of GETGOs citing at least one use for program improvement in past three years. Indicator of success: 100% meet.
 c. Use data to improve the General Education assessment system. Timeline: Spring 2010. Method: % of GETGOs citing at least one use for assessment improvement in past three years. Indicator of success: 100% meet.
Appendix E. Report on Undergraduate Studies
by Amos Olagunju
Undergraduate Studies 2007-08 Assessment Report
and Assessment Plans for 2008-2009
Prepared by Amos Olagunju, Undergraduate Studies Director
Submitted October 1, 2008
1. Summary
Areas of Assessment Distinction
 Student performance in Reading 110 & 120 is measured by pre- and post-tests
 Student performance in Math 070 & 072 is measured by completion rate and by performance relative to developmental programs at other institutions
Assessment Plan for 2008-2009
Undergraduate Studies will collect, analyze and report on:
Quantitative Data
 Student persistence rates: fall to spring and first to second year, along with rates of declaring a major, transferring, and leaving school
 Achievement – grade point average and course completion for DGS, FYE, and Honors students
 Analysis of persistence data – how well are we doing with students of various backgrounds (e.g., high school achievement, race/ethnicity/culture, residency status, socioeconomic status, first-generation status)?
 Completion rates – completed credits divided by attempted credits (a computational sketch follows this list)
 Probation & suspension rates
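To make the rate definitions above concrete, here is a minimal computational sketch in Python; the record layout and the numbers are hypothetical and are not drawn from SCSU data:

    # Hypothetical student records:
    # (enrolled_fall, enrolled_spring, credits_completed, credits_attempted)
    students = [
        (True, True, 15, 16),
        (True, False, 9, 16),
        (True, True, 12, 12),
    ]

    fall_count = sum(1 for s in students if s[0])
    persisted = sum(1 for s in students if s[0] and s[1])
    persistence_rate = persisted / fall_count    # fall-to-spring persistence

    completed = sum(s[2] for s in students)
    attempted = sum(s[3] for s in students)
    completion_rate = completed / attempted      # completed credits / attempted credits

    print(f"Fall-to-spring persistence: {persistence_rate:.1%}")
    print(f"Credit completion rate: {completion_rate:.1%}")

The same tallies can be broken out by subgroup (e.g., first-generation status) to support the persistence analysis described above.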
Qualitative Data
 Student surveys of orientation and the first-year experience (what is working, what is not,
what is missing)
 Focus group interviews with students on advising
 Feedback from Academic Resource Mentors, staff, and faculty
2. Academic Learning Center’s Assessment Plan
 Pre- and post-tests for Reading 110 and Reading 120 have been set up
o RDNG 110—Learning and Study Strategies Inventory has been administered, and follow-up assessment procedures are under development
o RDNG 120—Nelson Denny Reading Assessment test
 Collect data from new tutor program and generate report
o Survey of student satisfaction
o Use of a scheduling program that will track student use of the program: frequency of use, reasons for coming, and students’ classification and ethnicity (a computational sketch follows this list)
 DGS First-Year Experience cohorts
o Continue collecting data on retention and academic progress of students in last year’s
cohort
o Collect data on retention and academic progress of students in both of this year’s
Learning Communities
o Evaluate effectiveness of paired classes using same criteria—retention and academic
progress
 Work with Deborah Bechtold in Institutional Research to continue analysis of Reading 110 student data, retention and graduation rates, and course GPA and its relationship to cumulative GPA
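As a rough illustration of how the pre/post testing and the scheduling program's usage log could be summarized, a minimal Python sketch follows; the scores and visit reasons are invented for illustration only:

    from collections import Counter

    # Hypothetical (pre_score, post_score) pairs from a reading pre/post test
    pre_post = [(41, 55), (38, 47), (52, 58)]
    gains = [post - pre for pre, post in pre_post]
    print(f"Mean pre-to-post gain: {sum(gains) / len(gains):.1f} points")

    # Hypothetical visit log from the scheduling program: one reason per visit
    visits = ["reading help", "math help", "reading help", "study skills"]
    for reason, count in Counter(visits).most_common():
        print(f"{reason}: {count} visit(s)")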
3. Advising Center’s Assessment Plan
 Develop student learning and development outcomes for the Advising Center
 Use focus groups to assess learning and development outcomes
 Develop a survey (Mobile Assessment) to be given to students following appointments
 Send a web-based survey to advisees who did not make an appointment
4. Division of General Studies’ Assessment and Plan
 DGS admissions data indicate that retention is down 5.4% from 2006-07, but the current enrollment of 450 students is up from 396 last year.
 COLL 150 and RDNG 110 are identified as the threshold course requirements for the DGS program; as such, assessment procedures in these courses reflect the overall DGS program.
o College Student Inventory (CSI) has been administered for COLL 150 students and
assessment procedures are being developed.
o 22 sections of COLL 150 are being offered with 19 instructors. Assessment protocols
and procedures in each section are being developed to conform to the Academic Program
Assessment Plan of the University.
o The DGS program is working with the DGS Advisory Committee to review past
curricular practices and current research with the goal of recommending curricular and
programmatic changes appropriate to students’ needs, the Presidential Work Plan,
Institutional Initiatives, the Academic Action Plan, Budget Procedures and Assessment
and other areas of strategic planning.
o Four Academic Resource Mentors (ARMs) are working with the overall DGS and
individual course sections. Data will be collected and analyzed to generate a meaningful
report.
o Two Graduate Assistants and one Intern are working with the DGS Faculty Director, focusing primarily on assessment and curriculum development.
 The Office of Institutional Research is presently helping to identify the courses taken by DGS students, with a focus on tracking their admission designations and first-year courses. The data will be used to develop assessment protocols and procedures. DGS will identify the admissions threshold, the DGS academic program threshold (courses required), courses taken by DGS students in light of their admission designations, first-year courses taken by DGS students, student support services specified for DGS students, and student support services utilized by DGS students.
 Multi-cultural Student Services, Admissions, and Advising have identified themselves as
student support services working with DGS students. The DGS will identify other campus
services and student resources in an effort to develop comprehensive supportive experiences
for DGS students.
5. First Year Experience and Transitioning Program’s Assessment Plan
 FYE/TP will implement a way to “fast-track” the approval of the FYE learning communities
which have a proven track record of solid enrollment, e.g. pre-med, meteorology, “Thinking
Out Loud,” DGS “Digital Connection,” etc.
 FYE/TP will follow up with units (e.g., Nursing) that had discussed developing FYEs last year, as well as units interested in setting up FYEs focused on specific student populations, e.g., veterans.
 FYE/TP will take advantage of the opportunity afforded by its association with the
Foundations of Excellence to rethink the FYE model, with an eye toward doing what works
in the SCSU institutional environment
 FYE/TP will review the FYE learning goals and develop a suitable assessment strategy, and
develop a real budget and appropriate cost management structure for FYE
6. Honors Program Assessment and Plan
 Served 150 first-year Honors students and approximately 225 returning Honors students;
freshman enrollment doubled from 75 to 150 but the retention rate of returning honors
students dropped 4.8%
 Phased in a two-credit-hour first-year Honors Seminar and a one-credit-hour Research Colloquium for all new students; all students conducted research under the guidance of mentors and made presentations to classmates, faculty, administrators, and parents.
 Honors students overwhelmingly evaluated their seminar experiences as enjoyable. Honors faculty members who were evaluated by the students received high ratings.
 Develop an operational teaching guideline with embedded assessment for Honors faculty and present the formal student evaluations of seminar activities and courses.
7. Orientation Assessment
 The Welcome Picnic was well-liked by the students and their families. Orientation Leaders
were passionate and made commendable efforts to aid student participation in all activities.
Attendance at sessions improved over 2007 Orientation.
 Effective check-in procedures need to be developed for all students. The importance and priorities of the Orientation Task Force should be communicated to new students. The very low attendance at academic sessions requires attention.
 The survey report of students’ satisfaction with the fall 2008 orientation activities is forthcoming.
8. Probation and Suspension Assessment
 Contact and assist all students on probation to create plans for making the best use of SCSU’s
student support services and resources to achieve success.
 Contact professors before midterm to determine the academic success of probationary
students in their current courses.
 Continue to create academic success plans for suspended students.
Appendix F. Report on Learning Resources and
Technology Services by Christine Inkster
LR&TS Assessment Report 2007-2008
Prepared by Chris Inkster, Coordinator of LR&TS Assessment
Submitted October 1, 2008
Overview
LR&TS strives to support all academic programs at SCSU by contributing to the learning and research activities of both students and faculty through the library and technology resources and services provided to the campus. Thus, LR&TS does not have a formal academic program comparable to department or college programs. This Assessment Report reflects the implicit student learning outcomes that form the foundation for LR&TS library and technology resources and services, and represent the broad goals of LR&TS regarding student learning and student-centered resources, support, and services.
Assessment at LR&TS
LR&TS Organization
LR&TS is organized into work groups based on services provided to the campus. These work
groups are related to the library (Reference*, Access*, and Collection Management*), to
technology (InforMedia Services*, Information Technology Services, and Instructional
Technologies and Infrastructure Services), and to teaching credit courses (Center for
Information*, assessed through the College of Education). Faculty members at LR&TS are
members of all groups indicated with an asterisk. Both faculty and staff have wide-ranging
responsibilities for helping students learn in a variety of environments within LR&TS.
LR&TS Coordinator of Assessment
LR&TS has had a half-time faculty position focused on assessment since 2003. Until fall 2005, when the current coordinator was assigned to this role, the position was shared by two faculty members. The coordinator's role is to facilitate assessment efforts in all of LR&TS. Major
responsibilities include representing LR&TS on the University Assessment Committee; planning,
implementing, and analyzing two annual major assessment projects; sharing analysis of
assessment projects with the Dean's Advisory Council and work group leaders; serving as a
resource person for others in LR&TS wishing to engage in assessment; encouraging the use of
assessment data in decision-making and making pertinent data available to decision-makers;
working with LR&TS Assessment Committee members to improve assessment procedures and
analysis; gathering assessment data related to LR&TS that is needed by other SCSU units; and
serving as a point person for any assessment-related activities within LR&TS.
Major Annual Assessment Projects
The LR&TS Coordinator of Assessment facilitates two major annual assessment projects. Both of
these surveys are indirect measures of student awareness of and satisfaction with resources and
services provided by LR&TS to the campus. The Miller Center Survey has been administered
four times and collects information from students using the building during a week in the spring
semester. A Telephone Survey conducted by the SCSU Survey Center collects similar information
from students who may or may not have used the Miller Center. Long-term comparison of
results is now possible as these two assessments have been revised and conducted regularly
since 2005. The LR&TS Assessment Coordinator analyzes and shares final data from each of the
surveys as the data is made available. Thus work groups are kept informed of assessment
results that could inform continuous improvement throughout the year.
In spring 2008, 300 students participated in the Miller Center Survey and 508 students
participated in the Telephone Survey.
Other LR&TS Assessment Projects
In spring 2007, LR&TS participated for the first time in LibQUAL+, a national Web-based survey
that invited undergraduate students, graduate students, and faculty to share their perceptions
and expectations for a wide range of library services. The LibQUAL+ results were analyzed in fall
2007 and have become a focus for areas to improve in library resources and services. In 2010,
the LibQUAL+ survey will be repeated.
In addition, this year several work groups conducted small-scale, focused surveys and
evaluations of services and resources. The LR&TS Coordinator of Assessment was available to
collaborate with the work groups on these assessment projects so that data collected would be
usable, comparable, and shared as needed.
In 2007-08, these projects included:
 Evaluation of library instruction sessions (Reference; N = 1,850 students in 115 sessions)
 Evaluation of Reference Desk service (Reference; N = 142)
 Evaluation of study rooms by study room users (Access; N = )
 Evaluation of technology and support of e-classrooms (ITIS; N = 75)
 Dean's Advisory Group (Dean's Office; N = 12)
 Library Space Use Study (four faculty from Reference and IMS; N = 3,996 students whose library space use was observed and classified)
LR&TS Assessment Committee
The LR&TS Assessment Committee, which includes representatives from several work groups as well as the LR&TS Associate Dean who oversees assessment efforts, meets periodically to advise on assessment projects, help with revisions of assessment questions and format, and assist in the analysis of the data collected. The LR&TS Coordinator of Assessment meets as needed with the LR&TS Associate Dean to discuss assessment issues.
Annual Assessment Report
The LR&TS Coordinator of Assessment prepares an annual report that analyzes the data from
these various sources, including data from SCSU reports such as NSSE, Graduating Senior Survey,
Graduate Study Survey, and others as available. As analysis of the individual assessment
projects is available, it is presented to members of the Dean’s Advisory Council so they can
begin using the data to make changes and improvements. When the full report is complete, it too is presented first to the Dean's Advisory Council.
Follow-Up to the Annual Assessment Report and Improvements Implemented
Work group leader meetings
The Coordinator of Assessment meets individually with each work group leader to discuss data
related to their area. The report and all appendices of data are made available online for all LR&TS employees to access.
Dean’s Advisory Council meetings
Several weeks after the assessment project analysis and the Assessment Report are presented, a
DAC meeting is devoted to discussion of the report and decisions are made for follow-up
research, changes, and improvements. The report with links to the appendices is also made
available in the reports section of the LR&TS Website. These reports were used several times
later in the year during decision-making discussions.
Workgroup improvements and changes
Reference librarians have discussed pedagogical methods that may be successful in increasing
students' satisfaction with library instruction sessions. For instance, Vision software installed in
the library instruction classroom, allowing librarians to demonstrate to students' computer
screens, has made it easier for students to see and understand the intricacies of searching
databases and catalogs. The technical problems with this software that were experienced in the
past seem to have finally been resolved by the vendor and the LR&TS technicians. Reference
librarians have also committed to seeking increased communication with professors asking for
library instruction sessions. Almost all reference librarians have learned LibData software and
are using it to create course-specific Course Guides to assist students in locating the most
germane information for their research. Several reference librarians are using Captivate
software to create tutorials that can be distributed via email or the Web to demonstrate
strategies for using key databases and other library resources. Reference librarians have also
discussed ways to improve student satisfaction with help at the Reference Desk. The creation of
more individual, point-of-need tutorials is one goal the group is working towards. The library has
also joined AskMN, a consortium of libraries that provides 24/7 reference assistance through
online chat; it is hoped that this extended service will improve students' access to reference
help after Reference Desk hours.
LR&TS improvement goal
As an example of using assessment data for improvement for the entire LR&TS unit, after the
LibQUAL+ data was shared and discussed in fall 2007, the DAC decided to focus on improving
student worker knowledge and helpfulness because this was identified in the survey as an area
that did not meet expectations of library users. A customer service training program was
implemented in the spring semester 2008 to assist student workers in developing and learning
these important skills, and service desks used other methods to encourage student workers in
learning this facet of their jobs. The goal is that when we again use the LibQUAL+ survey (in
2010), student worker knowledge and customer service skills will be positively perceived.
Poster about improvements for students to see during assessment activities
The Coordinator of Assessment prepared a poster listing specific changes and improvements made as a result of student responses on the two major surveys, to display when the spring Miller Center Survey was conducted, so that students know that their suggestions are listened to and improvements are made. Changes made at LR&TS that were suggested by students in 2006-07 assessment data, and that are closely aligned with student learning and performance in their classes, included the following:
 Book collection not always adequate for research needs
o Purchased and added more than 10,000 books and more than 12,000 e-books
 Some journal articles not available at SCSU
o Selected and purchased increased access to full-text journals
o Improved FindIt! link, making it easy to request a copy from another library
 Not enough e-classrooms on campus
o Campus now has more than 120 e-classrooms
 Some areas of Miller Center are noisy, especially cell phones
o Increased signs and efforts to encourage students to use cell phones only in lobby area; plans are underway for designating Quiet Areas and Group Areas
 Not aware of computer software workshops for students
o Added easel and poster on 2nd floor to advertise times of free workshops
 Not enough printers
o Purchased and added a double-sided printer in Reference area
 Writing a bibliography is hard
o Increased number of RefWorks workshops on learning to create bibliographies
Anticipated improvements
Almost all of these improvements (as well as others not included on this list) were already
anticipated by LR&TS work groups before the Annual Assessment Report was completed.
Planning for improvement and implementing changes is often a lengthy process, and LR&TS
work groups continually strive to improve services and resources.
Summary of LR&TS Assessment Related to Student Learning
Questions and responses that most closely interconnect with student learning and LR&TS faculty
instructional roles have been selected from the Miller Center Survey and the Telephone Survey.
Other data are from smaller-scale surveys focused on specific services and conducted by LR&TS workgroups. All data are from indirect rather than direct measures and represent student self-reports.
For each program learning outcome assessed this year, the report records where the outcome was assessed (course or other activity), the assessment methods and tools used, and key findings (what the results show about student learning and how well the outcome was met).

Outcome: Students in library instruction sessions will report increased confidence in being able to locate research appropriate for their assignments.
 Where assessed: at the conclusion of 115 sections of professor-requested library instruction sessions (N = 1,850). Method: brief half-page self-report evaluation filled out by 1,850 students attending library instruction sessions with their classes; anonymous evaluation forms were entered into a spreadsheet by an adjunct librarian, and library faculty are then able to look at responses from all of their library instruction sessions in order to make improvements (a tallying sketch follows this table). Key findings: 91.9% indicated they were more confident about doing research for the class, and 7% were not sure; 93% reported that the session was helpful, and 6% were not sure.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked if they had used and were satisfied with library instruction sessions. Key findings: 83% of students who attended sessions were satisfied (down from 88% in 2007); 41% were aware of the service but had not used it.

Outcome: Students who seek assistance from the Reference Desk (in person, by phone, or by email) will report satisfaction with the help they received.
 Where assessed: surveys distributed to students who seek help at the Reference Desk during one week in fall semester and one week in spring semester. Method: brief quarter-page self-report evaluation filled out by students who ask for assistance at the Reference Desk; anonymous forms were analyzed by the Reference Coordinator, who shared general trends with all Reference librarians. Although more questions were answered during the weeks of the evaluation, 142 forms were returned. Key findings: virtually 100% of the students who returned forms were highly satisfied with the assistance they received.
 Where assessed: Miller Center Survey (N = 142). Method: students were asked if they had used and were satisfied with asking for assistance at the Reference Desk. Key findings: 88% of students who used the Reference Desk were satisfied; 38% were aware of the service but had not used it.
 Where assessed: telephone survey of students (N = 508). Method: students were asked if they were satisfied with assistance at the Reference Desk. Key findings: 96% of users were satisfied with Reference Desk assistance, though 35% were not aware of the service.

Outcome: Students who seek assistance with D2L will report satisfaction with the help they received.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked if they had used and were satisfied with assistance they received with D2L. Key findings: 95% of students who had asked for help with D2L were satisfied.

Outcome: Students in classes that use D2L will report that using D2L improves their learning and class performance.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked if using D2L as part of their class improves their learning. Key findings: 83% of students who had used D2L agreed that using it improved their learning.

Outcome: Students who participate in technology training workshops will report satisfaction with the workshops.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked if they were satisfied with the technology workshops. Key findings: 82% agreed that the technology workshops were satisfactory.
 Where assessed: telephone survey of students (N = 508). Method: students were asked if they were satisfied with the technology workshops. Key findings: 87% agreed that the technology workshops were satisfactory.

Outcome: Students in classes that meet in e-classrooms will report that the use of the technology improves their learning and class performance.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked if the equipment in the campus electronic classrooms (instructor station, Internet connection, projector, etc.) is beneficial and improves their learning. Key findings: 88% agreed or strongly agreed that the e-classroom technology improved their learning.

Outcome: Students who use the Miller Center will report that library and technology resources and services have helped with their assignments.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked if library and technology resources and services have helped with their assignments. Key findings: 92% were satisfied with the ways in which library and technology services have helped with their assignments.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked if library and technology resources and services support their academic learning. Key findings: 95% were satisfied with the support from library and technology services.

Outcome: Students who have used the Miller Center facility will report overall satisfaction with their visits.
 Where assessed: Miller Center Survey (N = 300). Method: students were asked why they had come to the Miller Center on the day of the survey and whether or not they were satisfied with their visit. Key findings: 93% were satisfied with their visit to the Miller Center on the day of the survey.
 Where assessed: telephone survey of students (N = 508). Method: students were asked about their overall satisfaction with Miller Center resources and technology. Key findings: 96% agreed or strongly agreed that they were satisfied.
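The percentages above come from straightforward tallies of returned forms. A minimal Python sketch of that kind of tally follows; the response coding and counts are hypothetical:

    # One coded answer per returned evaluation form (hypothetical data)
    responses = ["yes", "yes", "not sure", "yes", "no", "yes"]

    n = len(responses)
    for answer in ("yes", "not sure", "no"):
        share = responses.count(answer) / n
        print(f"{answer}: {share:.1%} of {n} forms")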
Plans for 2008-09
Tentative plans are listed below. These possible projects represent an attempt to move closer toward direct measures of student learning related to use of LR&TS resources and services. Using direct measures presents significant challenges to a unit that supports all students, programs, and faculty at SCSU; thus the projects are tentatively planned to be small-scale. The LR&TS Assessment Committee will complete the 2008-09 Assessment Plan later in October.
 The same outcomes will be evaluated next year in order to continue to build long-term data for comparison.
o Continue with the two major surveys: in the Miller Center and by telephone through the SCSU Survey Center. Both measure self-reported awareness of and satisfaction with library and technology resources and services.
 Students' bibliographies for a major research assignment in selected class(es) will demonstrate their competence in using library resources to fulfill a specific research need (a rubric sketch follows this list).
o Librarians and faculty in a few selected classes will collaborate to evaluate the quality of the items students selected for major research projects. Tentative plan: student bibliographies in one or two classes that have used library instruction for discipline-specific research will be analyzed by a librarian and the professor to determine if students have used quality resources and have located appropriate research to support their needs for the assignment.
 Faculty members who use library instruction will be satisfied with the research their students do on major research projects.
o Librarians will design an assessment tool (such as embedded questions in a course evaluation or test) to be completed by students who use library instruction. Faculty will complete a different tool to indicate their satisfaction with students' learning. Tentative plan: the evaluation tool will be designed to determine the faculty member's satisfaction with the quality of resources and the skill students developed in using library and other information resources.
 Librarians will identify several student learning outcomes related to library resources.
o It is not yet clear where or how this data might be collected; StudentVoice hand-held computers might be used for this project. Tentative plan: reference and instruction librarians will identify several measurable student learning outcomes that can be used to assess students' growth and skill in using library resources.
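If the bibliography review proceeds, the librarian and professor would need shared scoring criteria. The Python sketch below shows one hypothetical form such a rubric could take; the criteria and weights are invented here purely for illustration and are not a settled instrument:

    # Illustrative rubric: weight per criterion a citation can satisfy
    CRITERIA = {"scholarly_source": 2, "current": 1, "relevant": 2}

    def score_citation(flags):
        """Sum the weights of the criteria this citation satisfies."""
        return sum(weight for name, weight in CRITERIA.items() if flags.get(name))

    # Hypothetical bibliography: each citation coded against the criteria
    bibliography = [
        {"scholarly_source": True, "current": True, "relevant": True},
        {"scholarly_source": False, "current": True, "relevant": False},
    ]
    scores = [score_citation(c) for c in bibliography]
    print(f"Mean citation score: {sum(scores) / len(scores):.1f} of {sum(CRITERIA.values())}")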
Summary
In the years following LR&TS's implementation of formal assessment of student awareness of
and satisfaction with library and technology resources and services, the organization has made
consistent progress toward developing a culture of assessment. Assessment activities and data
seem to be valued and are used when appropriate to plan and implement changes. If SCSU
moves toward adopting campus-wide student learning outcomes, LR&TS is poised to strengthen
the connections to improve student learning that it has created and sustained.
Appendix G. Report on Student Life and Development
by James Knutson-Kolodzne
Student Life & Development Assessment Report
Prepared by James Knutson-Kolodzne
Submitted July 2008
The Division of Student Life and Development has established learning outcomes and translated
them into understandable outcomes for our students.
 A marketing goal was to disseminate these learning outcomes throughout the SLD
division as well as across campus.
Each department of the division is currently in the process of establishing departmental and
programmatic learning outcomes and identifying methods of assessment.
The 2007-2008 goals of the SLD assessment committee were to:
 develop a process for reviewing department assessment plans
 conduct an audit of the assessment skill level within the SLD division
 develop a training and professional development plan to help support the faculty and staff
skill sets needed to effectively develop learning outcomes
 purchase materials on assessment practices in Student Affairs
o Received a University Assessment grant and purchased a Student Affairs Assessment Manual for each of the 14 departments within SLD.
 nurture the transformation toward a culture of assessment through professional competency.
Essentially, the challenge is to provide support for the division to develop the competencies
inherent to a culture of assessment.
The division must plan proper support for competency development and resources to inspire
confidence and move the division forward with regard to assessment, measurement, and effective
methodologies.
It is our intent to secure resources to jumpstart the required competency development:
 identify needs and assess our competency as a division
o Developed and implemented a Needs Assessment Survey in Spring 08
 provide ‘in-house’ support to help each member of the division (in progress)
The division is eager to move forward, but has a limited number of personnel who are at a sufficient level of competency. The motivation and expectation are present, but the need to measure who needs what support, and to provide competency development, is critical to an “all hands” effort. This, in turn, sets the expectation and momentum to make assessment principles and practices a mainstream part of all program entities in the SLD division, and also provides leadership in moving SCSU in an integrated learning direction for the benefit of the students.